Archives for May 2017

OPEN Government Data Act moves to Senate floor after markup

By: Jonathan Haggerty 

Legislation requiring federal agencies to publish their data online in a searchable, nonproprietary, machine-readable format has been cleared for the Senate following a May 17 markup by the Senate Homeland Security and Governmental Affairs Committee.

Sponsored by Sen. Brian Schatz, D-Hawaii, S. 760, the Open, Public, Electronic and Necessary Government Data Act is identical to an earlier Schatz bill that passed the Senate unanimously last year after a Congressional Budget Office analysis determined it wouldn’t cost taxpayers any money.

The bill would modernize government agencies and increase their effectiveness, while also allowing taxpayers to see how their money is spent. For these reasons, R Street joined more than 80 organizations, including trade groups, businesses and other civil-society groups, in urging the Senate committee to pass these badly needed reforms.

The status quo makes it difficult for engaged citizens to view the spending data of the agencies they fund. A taxpayer interested in viewing the companies and organizations that receive federal grants and contract awards would need to have a license for the proprietary Data Universal Numbering System (DUNS). Dun & Bradstreet Inc., the company that owns DUNS, functions as a monopoly with respect to government contractor data.
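To make the format requirement concrete, here is a minimal sketch, using entirely hypothetical field names and figures, of what nonproprietary, machine-readable award data could look like and why it can be searched and totaled without any special license:

```python
import json

# Hypothetical award records in a nonproprietary, machine-readable format
# (JSON). The field names and amounts are invented for illustration and do
# not come from the bill or any agency schema.
awards_json = """
[
  {"recipient": "Example Research Corp", "agency": "Department of Examples",
   "award_type": "grant", "amount": 250000},
  {"recipient": "Acme Infrastructure LLC", "agency": "Department of Examples",
   "award_type": "contract", "amount": 1200000}
]
"""

awards = json.loads(awards_json)

# Because the data is structured, anyone can filter and total it in a few
# lines of code, with no proprietary numbering system or license required.
grants = [a for a in awards if a["award_type"] == "grant"]
total = sum(a["amount"] for a in awards)
print(f"{len(grants)} grant(s); total awarded: ${total:,}")
```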


Three years in, what does the DATA Act tell us about agency spending?

By: C. Jarrett Dieterle 

Trying to figure out exactly how much money the federal government spends long has been an exercise in futility for the few brave souls who attempt it. Though the U.S. Treasury has published financial data since the beginning of the republic, the government has an uneven history, to say the least, when it comes to reporting agency expenditures.

Agencies traditionally have employed a hodgepodge of data and spending models that fail to adhere to a common metric. This makes it difficult for lawmakers and policy experts to wrap their arms fully around federal agency spending. Since at least the 1970s, efforts have been afoot to standardize government data, culminating in 2014’s Digital Accountability and Transparency Act, also known as the DATA Act.

The bill’s purpose was to make expenditures both more transparent and more accessible. It requires Treasury to establish common reporting standards across all federal agencies, with the data posted online in a publicly accessible format.
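Under the law, the standardized data flows to USAspending.gov, which also exposes a public API. The snippet below is a rough sketch of pulling agency-level figures from that API; the endpoint path and response field names are assumptions made for illustration, so consult the site’s current API documentation before relying on them.

```python
import requests

# Sketch: list top-tier agencies by reported budgetary resources using the
# USAspending.gov API. The endpoint path and field names are assumed for
# illustration; see https://api.usaspending.gov for the current documentation.
URL = "https://api.usaspending.gov/api/v2/references/toptier_agencies/"

resp = requests.get(URL, timeout=30)
resp.raise_for_status()
agencies = resp.json().get("results", [])

# Sort by the (assumed) budgetary-resources field and print the top five.
top = sorted(agencies, key=lambda a: a.get("budget_authority_amount") or 0,
             reverse=True)[:5]
for agency in top:
    print(agency.get("agency_name"), agency.get("budget_authority_amount"))
```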

The DATA Act has been in the news again recently because the first agency reporting deadline is May 9, the third anniversary of the law’s passage. Right on cue, the DATA Coalition hosted a panel discussion and “hackathon” last week to let teams of data wonks work with some of the early datasets the agencies have provided.

Keynote speaker Rep. Jim Jordan, R-Ohio, emphasized the potential for uniform spending data to shape policy by helping lawmakers better understand the scope and size of government. That, in turn, could allow them to enact more meaningful reforms. As he put it: “If you don’t know where you are, it’s impossible to know where you’re going.”

The coalition also hosted a panel featuring three individuals who have been key to creating the uniform financial data standards the agencies now must use: Christina Ho, deputy assistant Treasury secretary for accounting policy and financial transparency; Dave Zvenyach, executive director of the General Services Administration’s 18F project; and Kristen Honey, senior policy adviser for the Office of Management and Budget’s chief information officer.

The panelists generally were optimistic about the implementation process, though each noted the difficulty involved in pursuing new endeavors within a convoluted bureaucracy like the federal government. Honey was sanguine about the potential for agencies to follow the lead of private industries that use open datasets for productive ends, noting that American taxpayers have “already paid for this data, so they should have access to it.”

She pointed to the example of a synthetic dataset the Department of Veterans Affairs published last fall that will help researchers study mental health issues among military veterans. Honey also predicted that state and local governments were likely to follow suit on open data initiatives, which she hoped would help expose and weed out inefficiencies in spending and operations across all levels of government.

The panelists also cautioned that many agencies likely will encounter difficulties aggregating and successfully publishing their spending data by the May 9 deadline. The concern was that if reports from the Government Accountability Office and agency inspectors general catalog widespread deficiencies around the first reporting deadline, it could lead the public and lawmakers to doubt the DATA Act’s efficacy.

James Madison famously described the power of the purse as “the most complete and effectual weapon” the people’s representatives could wield. Increasing the standardization and transparency of government spending data will only help strengthen that power.

Congressmen reintroduce bill to make CRS reports public

By: Jonathan Haggerty

The Government Publishing Office would be required to make Congressional Research Service reports publicly accessible over the internet, under legislation reintroduced last week by Reps. Leonard Lance, R-N.J., and Mike Quigley, D-Ill.

The CRS, a division of the Library of Congress, is known as Congress’ in-house “think tank.” House offices and committees historically have been free to publish CRS reports on their own websites for constituents to view, and some third parties aggregate CRS data on websites like everyCRSreport.com.

But while taxpayers spend more than $100 million annually to fund CRS, timely access to these important documents is usually reserved for Washington insiders. There is no official, aggregated source through which taxpayers can access the CRS’ valuable and informative work.

R Street Vice President Kevin Kosar, himself a veteran CRS analyst, testified recently before the House Legislative Branch Appropriations Subcommittee, where he presented the panel with a letter signed by 25 former CRS employees with more than 570 combined years of service, all of whom support an open public database of nonconfidential CRS reports.

There is strong precedent for public access to legislative support agency documents. In his subcommittee testimony, Kevin noted the Government Accountability Office, Congressional Budget Office and the Law Library of Congress all make their reports public, as do the 85 percent of G-20 countries whose parliaments have subject-matter experts.

Proposals like the Lance-Quigley bill would place publishing responsibilities with another entity to ameliorate CRS concerns about the service having to publish the reports itself. Briefings and confidential memoranda would not be disclosed, and data issued to the public through a searchable, aggregated database would include only nonconfidential information.

As Kevin noted in his testimony, the public deserves to be on equal footing with lobbyists and the Hill.

Alex Pollock: Data Transparency and Multiple Perspectives

At the Data Coalition’s Financial Data Summit in March, Alex Pollock, distinguished senior fellow with the R Street Institute and former president and CEO of the Federal Home Loan Bank of Chicago, delivered the plenary address. These are Mr. Pollock’s remarks as prepared for delivery.

One question underlying the very interesting data project and proposed legislation we are considering today is the relationship of data transparency to multiple perspectives on financial reality.  In a minute, we will take up the question: Of all the possible views of a statue, which is the true view?

But first, let me say what a pleasure it is to participate in these discussions of financial transparency; the new Financial Transparency Act, a bill introduced in Congress last night; data standardization; and of course greater efficiency—we are all for making reporting and compliance less costly.

Let me add to this list the separation of data and analysis, or what we may call the multiplication of perspectives on the financial object.  The potential separation of data and analysis may allow a richer and more varied analysis and deeper understanding, in addition to greater efficiency, in both government and business.

As the new white paper, “Standard Business Reporting,” by the Data Foundation and PricewaterhouseCoopers, says: “By eliminating documents and PDFs from their intake, and replacing document-based reporting with open data…agencies…gained the ability to deploy analytics without any translation.”  Further, that standardized data “will allow individuals to focus on analytics and spend time understanding the data.”
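To make the separation of data and analysis concrete, here is a small sketch, with records invented for illustration rather than drawn from the speech or the white paper, showing how one standardized dataset can support several analytical perspectives without being rewritten for each report:

```python
from collections import defaultdict

# Invented, standardized spending records: one dataset, no report baked in.
records = [
    {"agency": "Agency A", "program": "Grants",    "quarter": "Q1", "amount": 120},
    {"agency": "Agency A", "program": "Contracts", "quarter": "Q1", "amount": 300},
    {"agency": "Agency B", "program": "Grants",    "quarter": "Q1", "amount": 80},
    {"agency": "Agency B", "program": "Contracts", "quarter": "Q2", "amount": 210},
]

def totals_by(rows, key):
    """One 'perspective': total amounts grouped by any chosen field."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key]] += row["amount"]
    return dict(totals)

# Two different views of the same underlying object, neither one privileged.
print("By agency: ", totals_by(records, "agency"))
print("By program:", totals_by(records, "program"))
```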

In historical contrast to these ideas, it is easy for me to remember when we couldn’t do anything like that.  When I was a young international banking officer working in Germany, one day 4,000 miles to the west, back in the Chicago headquarters, the head of the International Banking Department had lunch with the Chairman of the Board.  Picture the Chairman’s elegant private dining room, with china, silver and obsequious service.  In the course of the lunch, the Chairman asked, “For our large customers, can we see in one place all the credit exposure we have to them in different places around the world?”  Said the Executive Vice President, International Banking, “Of course we can!”

The next day, all over the world, junior people like me were busy with yellow pads and calculators, wildly working to add up all the credit exposure grouped into corporate families, so those papers could be sent to somebody else to aggregate further until ultimately they all were added up for the Chairman.

That was definitely not data independent of documents.  Imagine the high probability, or rather the certainty, of error in all those manually prepared pages.

A classic problem in the philosophical theory of knowledge turns out to be highly relevant to the issues of data transparency.  It is the difference between the real object, the “thing in itself,” and any particular representation or perspective on it.  In philosophical terms, the object is different from any particular perspective on it, but we can only perceive it, or think about it, or analyze it, from particular perspectives.

Likewise, a reporting document is a composite of the data—the thing—and some theory or perspective on the data that frames the questions the report is designed to pursue and answer.

Let’s consider a famous type of report: GAAP financial statements.

Somewhere far underneath all the high level abstractions reflecting many accounting theories are the debits and credits, myriads of them doing a complex dance.

I think of my old, practical-minded instructor in Accounting 101.  This essential subject I studied in night school when I was a trainee in the bank.  I would ride the Chicago L train to my class, my feet freezing from the cold draft blowing under the doors.  But this lesson got burned into my mind: “If you don’t know what to debit and what to credit,” he said, “then you don’t understand the transaction from an accounting point of view.”  This has always seemed to me exactly right.
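For readers who never took that night-school class, here is a toy illustration, not taken from the remarks, of the instructor’s test: recording even a simple transaction means choosing what to debit and what to credit, and the two sides must balance.

```python
# Toy double-entry example (illustrative only): recording a $100 cash sale.
# Knowing what to debit and what to credit is knowing the transaction.
entries = [
    {"account": "Cash",    "debit": 100, "credit": 0},
    {"account": "Revenue", "debit": 0,   "credit": 100},
]

total_debits = sum(e["debit"] for e in entries)
total_credits = sum(e["credit"] for e in entries)
assert total_debits == total_credits, "every transaction must balance"
print(f"Debits {total_debits} = Credits {total_credits}")
```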

Later on, in this spirit, I used to enjoy saying to accountants advising me on some accounting theory: “Just tell me what you are going to debit and what you are going to credit.”  This usually surprised them!

I wonder how many of us here could even begin to pass my old accounting instructor’s test when considering, say, the consolidated financial statements of JPMorgan.  What would you debit and credit to produce those?  Of course we don’t know.

For JPMorgan, and everybody else, the debits and credits are turned into financial statements by a very large set of elaborate theories and imposed perspectives.  These are mandated by thousands of pages of Financial Accounting Standards pronounced by the Financial Accounting Standards Board.  Many of these binding interpretations are highly debatable and subject to strongly held, inconsistent views among equally knowledgeable experts.

Any large regulatory report has the same character: it is a compound of data and theory.

An articulate recent letter to the editor of the Wall Street Journal argues that “The CPA profession has made the accounting rules so convoluted that GAAP financials no longer tell you whether the company actually made money.”  This, the letter continues, is “why companies are increasingly reporting non-GAAP.  Investors are demanding this information. …Why should public companies not supply shareholders with the same metrics that the management uses?”  Why not, indeed?

In other words, why not have multiple interpretative perspectives on the same data, instead of only one?  This is a fine example of the difference between one perspective—GAAP—and other possibly insightful perspectives on the same financial object.  Why not have as many perspectives readily available as prove to be useful?

We are meeting today in Washington, DC, a city full of equestrian statues of winning Civil War generals.  (The losing side is naturally not represented.)  Think, for example, of the statues of General Grant or Sheridan or Sherman or Logan—all astride their steeds.  Perhaps you can picture these heroic statues in imagination.

I like to ask people to consider this question:  What is the true view of a statue?  Is it the one from the front, the top, the side (which side?), or what?  Every view is a true view, but each is partial.  Even the view of such an equestrian statue directly from behind—featuring the horse’s derriere—is one true view among others.  It is not the most attractive one, perhaps, but it may make you think of some people you know.

Likewise, what is the true view of a company, a bank, a government agency, a regulated activity, or a customer relationship?  Every document is one view.

Pondering this brings back a memory of my professor of 19th century German philosophy.  “The object,” he proposed, “is the sum of all possible perspectives on it.”

Similarly, we may say that a financial structure, or a policy problem, or an entity or an activity is the sum of many perspectives on it.  The ideal of open data available for multiple reports, analyses and purposes is a practical application of this metaphysical idea.

The ideal is not new.  In 1975, I went to London to work on a project to define all the elements—what were supposed to become the standardized data—for characterizing all the bank’s corporate customer relationships.  The computing technology expert leading the project convincingly explained how these data elements could then be combined and reordered into all the reports and analyses we would desire.

Then, as now, it was a great idea—but it never actually happened.  It was ahead of its time in its technical demands.  But now I suspect the time has really come.  Fortunately, since then, we have had four decades of Moore’s Law operating, so that our information capacities are more than astronomically expanded.

So:

  • By freeing transparent, open data from being held captive in the dictated perspectives of thousands of reporting documents,
  • By saving data from being lost in the muddle of mutually inconsistent documents,
  • Can we provide transparent data, consistently defined, which will promote a wide variety of perspectives to enrich our analysis and create new insights,
  • Not to mention making the process a lot cheaper?

This would be a great outcome of the project under consideration in our discussions today.

 

GovTrack testimony to the House of Representatives on public access to legislative information

Everything that our government does starts with an “appropriation” that sets a funding level for it. When Congress sets funding levels for the government as a whole, it also sets funding levels for itself: to pay congressional staff and the Capitol Police, to maintain the office buildings, and so on. (It’s about 0.1% of the total federal budget.)

On Wednesday we will be testifying before the House subcommittee with jurisdiction over Congress’s own funding limit — the House Committee on Appropriations Subcommittee on Legislative Branch — to talk about the importance of funding public access to legislative information so that we can bring that information to you.

You should be able to watch us here on Wednesday at 10:00 am Eastern.

Some of our testimony will be based on feedback we got on Twitter.

We’ll be commending the committee for supporting public access to legislative information in recent years (background), and our requests mirror those that the Congressional Data Coalition (of which we are a member) has made over the years.

Our written testimony is pasted below:


Testimony for the Record: FY 2018 Legislative Branch Budget Request

Submitted by: Joshua Tauberer, Ph.D., President, Civic Impulse, LLC
To: House Committee on Appropriations Subcommittee on Legislative Branch
Regarding: Public Access to Legislative Information

April 28, 2017

Dear members of the subcommittee:

Each year ten million individuals use our free website www.GovTrack.us to research and track legislation in the U.S. Congress. Our users include journalists, legislative affairs professionals, legislative staff on the Hill, advocates, teachers, students, and of course members of the general public. This testimony is submitted on their behalf.

I would like to begin by commending the subcommittee for its support of important programs in the last several years that have allowed us to bring accurate and timely information to our users:

  • The House Bulk Data Taskforce’s legislative bulk data program, which went live in 2016 and was a joint effort of the Government Publishing Office, the Library of Congress, the Clerk of the House, and the Senate, has allowed us to disseminate the most accurate information yet about the status of pending legislation.
  • Several projects of the House Committee on House Administration, including Docs.house.gov, publishing the United States Code in XML (with the Law Revision Counsel) [in my personal capacity, I was a subcontractor on this project], improving the bill drafting process using XML (with the Office of the Legislative Counsel), and the yearly Legislative Data and Transparency conference.
  • Improvements to the House Clerk’s website, including new member information.
  • The launch of Congress.gov by the Library of Congress, and its agile-led improvements since launch, which is an example for the whole legislative branch of how best to develop modern technology.
  • Digitization and publication of core historical documents by the Government Publishing Office and Library of Congress, including the Congressional Record, Statutes at Large, and Constitution Annotated (though more work is needed here).

Public access to legislative information remains an important need, and the subcommittee’s support for programs that provide such access ensures that accurate information reaches the American public: not only our users, but also the tens of millions of Americans who learn about their government from newspapers and magazines that rely on our service and Congress.gov for their research.

I also commend the staff at the House offices and legislative branch agencies named above who have done remarkable work in producing accurate, durable, and timely information within the constraints that an institution like the House of Representatives requires.

To continue the subcommittee’s commitment to public access to legislative information, I respectfully recommend the following:

  • Create a public advisory committee on legislative transparency for stakeholders to engage systematically on this issue, including but not limited to access to data.
  • Make the Bulk Data Taskforce permanent and fund the participation of the offices and agencies that are members of the taskforce.
  • Support congressional publication of other important information in a structured data format, including amendments, House committee votes, the Biographical Directory (Bioguide), and committee witness documents.
  • Continue to support efforts to modernize the House’s technology systems, especially with respect to the work of committees and efforts to connect constituents to their representatives. Cultivate the legislative branch’s in-house technology talent as other parts of the government are doing.
  • Increase House staff levels above their current historic lows so the House has sufficient capacity for policy analysis and oversight, and direct the Congressional Research Service to report on how staffing levels impact the House’s capacity to function, and make that report public.
  • Systematically release the non-confidential Congressional Research Service reports to the general public. Years of experience have demonstrated that public access to these reports enhances the public debate without creating a commensurate burden.

I would be glad to discuss these topics further and tell you more about how the work of the House on public access to legislative information translates into a stronger democracy.

Yours,

Joshua Tauberer
President, Civic Impulse, LLC (GovTrack.us)

Originally published here.