Blog

  • May 16, 2019 9:00 AM | Anonymous member (Administrator)

    The following comments were adapted from the RegTech Data Summit Keynote Address of Ben Harris, Chief Economist, Results for America – Former Chief Economist and Economic Advisor to Vice President Joe Biden. Delivered April 23, 2019 in New York City.

    In a time of seemingly insurmountable partisanship, Congress was able to come together around the issue of evidence-based policy and pass the Foundations for Evidence-Based Policymaking Act (Evidence Act) that makes some dramatic changes to our ability to learn from data and evidence in our quest for better policy. As many of you may know, the OPEN Government Data Act—which was included in the Evidence Act—implements some remarkable changes, including:

    • Installing Chief Data Officers at all federal agencies;
    • Documenting and coordinating the massive breadth of data collected by agencies; and
    • Directing that non-sensitive government data be open by default.

    To start, it’s probably worthwhile to acknowledge that issues like data standardization and calls for access to open data might not be the sexiest topics, but we all can appreciate their importance.

    As a newcomer to this topic, I have only begun to understand and appreciate the need for widespread access to standardized, machine-readable data.

    The need for standardized and accessible data is crucial, but so is the need to inject more evidence-based policy into our system of legislation and regulation.

    I have come to believe that our whole economy, not just government regulators, faces a massive information deficit. Our economy, which now runs on data and information more than ever, still has gaping holes in the availability of information that undermine markets and can lead to wildly inefficient outcomes.

    When it comes to evidence-based policymaking, our government has a long way to go. As pointed out in a 2013 op-ed in The Atlantic by Peter Orszag and John Bridgeland, only about one percent of our federal dollars are allocated based on evidence and evaluations. From my perspective, this is about ninety-nine percentage points too few.

    The lack of evaluation can inject massive and longstanding inefficiencies into our federal, state, and city-level budgets, resulting in wasteful spending and missed opportunities to improve lives. This is never more evident than in our country’s $1.5 trillion tax expenditure budget. We have hardly stopped to ask whether the $1.5 trillion spent annually on targeted tax breaks is achieving its desired objectives.

    The benefits of better evidence and data extend well beyond direct spending and tax administration. They can also mitigate the economic pain caused by a recession. Indeed, the severity of the financial crisis was exacerbated by the information deficit in the wake of Lehman’s collapse and the inevitable chaos that followed. Had financial firms and regulators been able to more accurately and quickly assess the extent of the damage through standardized financial data, we would have seen less radical actions by investors to withdraw from credit risk and more effective government intervention. Of all the factors that played a role in the crisis, I don’t think it’s hyperbole to say that lack of data standardization is perhaps the least appreciated.

    Evidence-based policy is also not just a matter of better government. It’s about people’s faith in government in the first place. Results for America recently commissioned a nationally representative survey of Americans’ attitudes toward the role of evidence in policymaking. When asked what most drives policymakers’ decisions, a whopping forty-two percent said “boosting popularity, or getting votes,” while thirty-four percent said the influence of lobbyists and just eight percent said evidence about what works. Surely these responses are cause for concern.

    Fortunately, there are solutions.

    To start, in a time when there are seemingly no bipartisan bills, we saw the passage of the Evidence Act—which is known to some as the umbrella bill for the OPEN Government Data Act. As I noted at the beginning, the Evidence Act represents a major step forward not just for the capacity of government agencies to implement evidence-based policy, but for the public to gain access to open, machine-readable data.

    Of course, this law is the beginning, not the end. We can help solve private market inefficiencies by calling for more data.

    • When it comes to better understanding the fees charged by financial advisers, the U.S. Securities and Exchange Commission (SEC) can amend Form ADV to include explicit questions on fees charged. It’s that simple.
    • When it comes to evaluating government programs, I can think of no more powerful tool than providing federal agencies a 1 percent set-aside for evaluation. Results for America has called for this for years, and it’s time that Congress pick up the charge.
    • When it comes to evaluating the $1.5 trillion tax expenditure budget, we’ll have to make some institutional changes. One option is to expand the capacity of a federal entity, like the Internal Revenue Service (IRS) or the White House Office of Management and Budget (OMB), to include periodic evaluations of this budget. Another is to call for regular Congressional approval, similar to the process for appropriations.
    • And as we prepare for the possibility of the next recession, we also need to build on the progress already made and work in earnest to make adoption of Legal Entity Identifiers (LEIs) ubiquitous across the financial sector. While the progress since the Great Recession has been impressive, we have more work to do to ensure this system covers not only entities in the U.S., but those of our economic allies as well.

    These reforms can and should be viewed as steps to aid the private sector, hopefully leading to better economic outcomes, lessened regulatory burdens, or both.

    On the whole, I am clear-eyed about the challenges faced by advocates for evidence-based policy. But the passage of the Evidence Act makes clear that progress can be made. To me, it feels like we are on the cusp of a new movement to incorporate data and evidence in all that government does. Together we can help ensure that policy does a better job of incorporating data and evidence, leading to improved lives for all Americans.


  • April 05, 2019 9:00 AM | Anonymous member (Administrator)

    In recent years, we have seen an explosion of regulatory technology, or “RegTech.” These solutions have the potential to transform the regulatory reporting process for the financial industry and the U.S. federal government. But RegTech can only thrive if government financial regulatory agencies, like the Securities and Exchange Commission (SEC), the Commodity Futures Trading Commission (CFTC), and the Federal Deposit Insurance Corporation (FDIC), adopt structured open data standards for the forms they collect from the private sector. We have seen changes and momentum for RegTech adoption is picking up, but there is much more to be done.

    At this year’s RegTech Data Summit on Tuesday, April 23, in New York, we’ll explore the intersection of regulatory reporting, emerging technology, and open data standards with financial regulators, industry leaders, RegTech experts, academics, and open data advocates.

    The Data Coalition has long advocated for RegTech policy reforms that make government regulatory reporting more efficient and less burdensome on both agencies and regulated entities. The benefits are clear. Unified data frameworks support efficient analytical systems; common, open data standards clear the path for more accurate market risk assessments among regulators and bolster transparency.

    The Summit comes at an opportune time. Federal financial regulators have already begun replacing document-based filings with open data standards.

    Within the past year, the SEC voted to mandate inline eXtensible Business Reporting Language (iXBRL) for corporate financial filings. The Federal Energy Regulatory Commission (FERC) proposed a rule change that would require a transition to XBRL from XML. The House of Representatives held the first-ever hearing on Standard Business Reporting (SBR). The Financial Stability Oversight Council (FSOC) reiterated its recommendation for the adoption of the Legal Entity Identifier (LEI) to improve data quality, oversight, and reporting efficiencies.

    There are also a number of international examples of government-wide adoption of open data that can serve as a guide for similar efforts in the U.S., which we will explore at our Summit. Standard Business Reporting (SBR), as successfully implemented by Australia, is still the gold standard for regulatory modernization efforts. By utilizing a standardized data structure to build SBR compliance solutions, the Australian government was able to streamline its reporting processes and save the government and private sector AUD 1 billion in 2015-2016.

    The success of SBR in Australia is undeniable. During our Summit, panelists will discuss how Congress is considering policy reforms that would enable the adoption of RegTech solutions and, in theory, achieve the same kind of savings as Australia. The Coalition has supported the Financial Transparency Act (FTA) since its introduction (H.R. 1530, 115th Congress). The FTA directs the eight major U.S. financial regulatory agencies to publish the information they collect from financial entities as open data: electronically searchable, downloadable in bulk, and free of license restrictions.

    Once financial regulatory reporting is expressed as standardized, open data instead of disconnected documents, RegTech applications can republish, analyze, and automate reporting processes, providing deeper insight and cutting costs.

    Entity identification systems are another pain point for the U.S. regulatory community. A recent Data Foundation report, jointly published with the Global Legal Entity Identifier Foundation (GLEIF), discovered that the U.S. federal government uses at least fifty distinct entity identification systems – all of which are separate and incompatible.

    If widely and properly implemented in the United States, a comprehensive entity identification system based on the LEI could help identify and mitigate risk in financial markets, track and debar low-performing federal contractors, improve supply chain efficiency, and generally be useful anywhere a government-to-business relationship exists. By working together, industry and government leaders can reap the benefits of these emerging RegTech solutions and open data applications.
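    To make the identifier problem concrete, here is a minimal, hypothetical Python sketch of the kind of crosswalk a single common identifier enables. The identifier systems, code values, and the LEI string below are invented for illustration and are not drawn from the report.

        # Illustrative sketch: resolving agency-specific entity codes to a single
        # Legal Entity Identifier (LEI). Every identifier and value below is invented.

        # Hypothetical crosswalk from agency-specific codes to one 20-character LEI.
        CROSSWALK = {
            ("DUNS", "123456789"): "529900EXAMPLELEI0042",   # contracting-system code
            ("CIK", "0000123456"): "529900EXAMPLELEI0042",   # SEC filer number
            ("RSSD", "9999999"):   "529900EXAMPLELEI0042",   # Federal Reserve bank ID
        }

        def to_lei(id_system: str, value: str) -> str | None:
            """Map an agency-specific identifier to the entity's LEI, if known."""
            return CROSSWALK.get((id_system, value))

        # Records held by two different agencies turn out to describe the same company.
        print(to_lei("CIK", "0000123456"))   # 529900EXAMPLELEI0042
        print(to_lei("RSSD", "9999999"))     # same LEI, so the records can be linked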

    Karla McKenna, who is Head of Standards at GLEIF and specializes in international financial standards, and Matt Reed, Chief Counsel at the U.S. Treasury’s Office of Financial Research, are among the leading voices we will hear from at the Summit. Together with Ken Lamar, former Special Vice President at the Federal Reserve Bank of New York, and Robin Doyle, Managing Director at the Office of Regulatory Affairs at J.P. Morgan Chase, they will analyze the status of open standards and the impact of a single entity identifier.

    We’ll be delving into RegTech applications like blockchain, analytic applications, and AI systems, as well as policies that will transform regulatory reporting like the FTA, and more at the second annual RegTech Data Summit on April 23. The Summit will convene financial regulators, industry leaders, academics, and open data advocates to discuss the latest innovations in regulatory technology and what the future holds.

    Summit-goers will have the opportunity to hear from SEC, Treasury, FDIC, and J.P. Morgan Chase representatives just to name a few. Featured speakers include former SEC Commissioner Troy Paredes; Dessa Glasser, Principal, The Financial Risk Group and formerly CDO, J.P. Morgan Asset Management; and Mark Montoya, Senior Business Analyst, FDIC.

    The Summit will focus on three main themes as we explore the future of U.S. regulatory reporting technology:

    • Enterprise Digitization: The modern enterprise faces a myriad of internal and external data challenges. By internally aligning common data formats and adopting open standards, financial institutions can build a competitive information foundation to more efficiently leverage emerging technology.
    • Open Data Standards: Adopting a single, open data standard for entity identification among U.S. regulatory agencies would create a framework for financial institutions and regulators to more accurately assess market risk, improve reporting efficiencies, lower transaction costs, and improve data quality.  
    • Reporting Modernization: By adopting open data standards, the U.S. government will be able to improve oversight and provide higher levels of accountability to citizens; facilitate data-driven analysis and decision making in agencies; and expand the use of automation, which will reduce compliance costs.

    It is clear that RegTech solutions will disrupt compliance norms by increasing efficiency, enhancing transparency, and driving analytics. However, successful implementation of this technology is only possible when government and industry focus on collecting, reporting, and publishing quality, structured data. If you are eager to explore the future of compliance, in which document-based regulatory reporting will become a thing of the past, then join us at the second annual RegTech Data Summit: The Intersection of Regulation, Data, and Technology.

    For more information on the Summit, check out our event webpage.

  • March 08, 2019 9:00 AM | Anonymous member (Administrator)

    When President Trump signed the Foundations for Evidence-Based Policymaking (FEBP) Act (P.L. 115-435) in January, the Data Coalition celebrated a major milestone for open data legislation in the federal government. Title II of the law, the Open, Public, Electronic, and Necessary (OPEN) Government Data Act, is a transformative open data policy that modernizes the way the government collects, publishes, and uses non-sensitive public information. The law mandates that all non-sensitive government data assets be made available as open, machine-readable data under an open license by default. The Data Coalition advocated for this legislation for over three years and it is now law. So, what next?  

    The Data Coalition, Center for Data Innovation (CDI), and the American Library Association hosted a joint panel to discuss the OPEN Government Data Act’s impact on the future of open data in the United States. The Coalition’s own Senior Director of Policy, Christian Hoehner, as well as representatives from BSA | The Software Alliance, the Internet Association, SPARC, and the Bipartisan Policy Center, discussed what this new law means for government modernization, data-centric decision making, and the implementation of successful federal open data initiatives.

    Congressman Derek Kilmer (D-WA-6), an original sponsor of the OPEN Government Data Act and Chairman of the newly established House Select Committee on Congressional Modernization, provided opening remarks that touched upon the bill’s wide-reaching benefits as well as the future of open data policy and implementation. Congressman Kilmer touted the new law’s potential to create more economic opportunity for people in more places. Greater access to government data will allow Americans to start new businesses, create new jobs, and expand access to data and resources often concentrated in urban areas.

    The Congressman emphasized that the law passed with strong bipartisan support. He observed that the simple notion of giving taxpayers additional access to public data is beneficial for citizens.

    “Simply put, the OPEN Government Data Act gives the data the government collects to the people who pay for it, which is all of you,” Congressman Kilmer said during his remarks. “The bill passed because people across the increasingly wide political spectrum know that making access to government data…is a good idea.”

    Opening valuable government data assets will increase government accountability and transparency as well as technological innovation and investment. Under the law’s mandates, public government data will be made machine-readable by default without compromising security or intellectual property. This is a major victory for the open data movement, but as Congressman Kilmer acknowledged, this is certainly not the last step. There is still much to be done to ensure the law is successfully implemented.

    While the OPEN Government Data Act draws a lot of its mandates from already established federal orders – specifically President Obama’s 2013 “Open Data Policy-Managing Information as an Asset” (M-13-13) – the law adds additional weight and formalizes a number of open data best practices. The law legally establishes definitions for “open license,” “public data asset,” and “machine-readable,” which clarify the specific assets under scrutiny in this law. As Nick Shockey of the Scholarly Publishing and Academic Resources Coalition (SPARC) stated, the strong definitions behind open licensing will be great in strictly codifying data procedures around research and beyond.

    Establishing these definitions was critical as it reinforces another key part of this law: requiring the publication of public data assets as “open government data assets” on agencies’ data inventories. Under M-13-13, agencies tried to establish a comprehensive data inventory, but internal silos and other barriers prevented the publication of certain data sets.

    Nick Hart of the Bipartisan Policy Center drew from an example to explain the barriers. “The Census Bureau, for example, has some data that they received from other agencies. The other agencies may not have reported it on their inventory because they gave it to the Census Bureau, [and] the Census Bureau didn’t report it because it wasn’t their data.” Identifying the extent of government data assets has been a constant challenge for researchers and industry, but the open data inventories mandated in Title II will clarify exactly what public data assets the government has on hand.

    The open data inventories will contain all government data assets that aren’t open by default but would otherwise be available under the Freedom of Information Act (FOIA). “That’s going to, I think, make these data inventories a lot more robust and a lot more useful to researchers as they’re trying to identify what data an agency has that might not be currently made available through Data.gov,” said Christian Troncoso, Policy Director at BSA | The Software Alliance.

    Another important aspect of the OPEN Government Data Act discussed during the event was the establishment of an agency Chief Data Officer (CDO) Council. Every federal agency is now required to appoint a CDO, and these individuals will be tasked with implementing the components of the new law. One major challenge going forward will be how federal agencies establish and equip their CDO functions: will they recognize that, as the law intends, the CDO function should be distinct and independently established apart from traditional information technology leadership functions? The hope is that CDOs will build communities of practice and work with their fellow members of the Council to share best data practices that could be implemented across agencies.

    In the end, CDOs will not just be Chief Information Officers under a different name. They will be the sentinels of quality, accurate, and complete agency data and, hopefully, shift the culture to one of data management and data-driven decision making. As our Senior Director of Policy, Christian Hoehner, said, better education about how data will be utilized by agencies and the public will help incentivize agencies to ensure compliance with the law.

    The panel served as a celebration of how far open data has come and judging by the opinions of all panelists, the government is on track to continue expanding its open data initiatives. The Federal Data Strategy and President's Management Agenda show that the government recognizes the value of its vast stores of data. These initiatives will run parallel to the implementation of the OPEN Government Data Act, and in some cases, they will also intertwine.

    These executive efforts, including a recent Executive Order on U.S. artificial intelligence, and bipartisan congressional support show that open data utilization is a government priority, and our Coalition is pleased to see action by the Executive Branch, Congress, and agencies.

    While there is still plenty of work to be done once the law takes effect on July 8, 2019, the passage and support of the OPEN Government Data Act is an indication that the government is moving closer to modernizing its processes and management using standardized, open data. The Data Coalition looks forward to working with the government on successfully implementing this transformative legislation.

    If you would like to watch the panel, “What’s Next for Open Data in the United States,” click here.


  • November 15, 2018 9:00 AM | Anonymous member (Administrator)

    Earlier this year, the Administration released its vision for the way our government can better collect and leverage data in four main categories:

    1. Enterprise Data Governance.
    2. Access, Use, and Augmentation.
    3. Decision Making & Accountability.
    4. Commercialization, Innovation, and Public Use.

    The Federal Data Strategy will define principles and practices for data management in the federal government. The Office of Management and Budget (OMB) is collaborating with the private sector, trade associations, academia, and civil society to gather feedback and comments on the proposed strategy.

    The Data Coalition joined the Bipartisan Policy Center (BPC) and OMB to co-host a public forum discussing the Federal Data Strategy last week (November 8). The second in a series on the strategy, this forum allowed the public, businesses, and other stakeholders to comment on the recently published draft set of practices. Data Coalition members DeepBD, Elder Research, Morningstar, SAP, Tableau, and Xcential, as well as Data Foundation supporter companies Kearney and Company and REI Systems, provided feedback on the proposed practices. In their comments, members emphasized the value of federal data as applied in analytics, a standards-based modeling path, and the use of machine-readable forms that would create a better link between government services and citizens.

    Most commenters acknowledged that the Federal Data Strategy represents an effort to initiate much-needed changes to the cultures around data across agencies, and they offered ways to improve the practices and implementation. Some attendees emphasized the need for greater clarity on the draft practices and provided examples of how the government can maximize the value of its data. Clarity and direction, they argued, would help move the strategy from an idea to a set of actionable steps for cultural change.

    Better data utilization and management were noted as keys to the success of the strategy. The Digital Accountability and Transparency Act (DATA Act) has significantly increased the quality of the data reported to the government. Our members who provided public statements were quick to bring attention to these improvements and to how the DATA Act set the groundwork to fortify potential efforts to reach CAP Goals 2 (Data as a Strategic Asset) and 8 (Results-Oriented Accountability for Grants Management).

    According to Sherry Weir of Kearney & Company, if OMB starts with a budget request in the same standardized data format, the U.S. Treasury and agencies could then merge data reported under the DATA Act (USAspending.gov) with federal appropriations bills. This connection is only possible with intelligent data stewardship, but it has the ability to connect otherwise disparate datasets across the budget lifecycle and provide insights that can motivate better, more informed federal spending and policymaking.

    Throughout the day, a few commenters expressed concern over the complexity of the current draft strategy. They pointed out that the strategy, laid out across forty-seven practices and organized into ten principles, is too unwieldy for executive decision makers to readily articulate across their organizations. The MITRE Corporation suggested that the strategy could be cut down to a single-page reference document and provided an example.

    It would be no simple task to distill the strategy. Panelists suggested that the Federal Data Strategy team look for small wins in data modernization efforts to build momentum on the larger goals.

    Larger conclusions presented by commenters included a view that the strategy fails if public servants cannot work across agency data silos to make better, data-driven management decisions that best serve the public.

    Stewardship is key to the success of the Federal Data Strategy, and the Administration needs sustained leadership to guide it in order to create the most value out of its vast stores of data. With the feedback of all these industry leaders, advocates, and data experts, OMB is now tasked with using the public perspective to build a data strategy that facilitates efficient government data management.

    The Data Coalition was thrilled to partner with BPC and OMB on this important forum. Audio recordings of the forum are available online, as is social media coverage of the event. As a reminder for interested parties, public comments on the updated Federal Data Strategy are due by Friday, November 23. Comments can be submitted online in various forms. The Data Coalition will be providing its own written comments on the Federal Data Strategy, which we hope the Administration will strongly consider when forming the final strategy.


  • September 07, 2018 9:00 AM | Anonymous member (Administrator)

    Modernizing financial regulatory reporting is no easy task. Regulators and regulated entities continue to rely on outdated technology and reporting systems. Open data standards are the key: by replacing document-based reports with standardized data, regulators can spur modernization.

    Data companies behind the Data Coalition have the solutions to make financial regulation more efficient and transparent for both regulatory agencies and the industry. Standardized data can be instantly analyzed, updated, and automated.

    On September 12, 2018, we brought together current and former Treasury and SEC officials, global reformers, and industry experts to explore the ongoing shift in financial regulatory reporting from documents to data — and its profound benefits for both regulators and regulated.

    “Modernizing Financial Regulatory Reporting: New Opportunities for Data and RegTech” was an opportunity for attendees to better understand how data standards will enable new RegTech and blockchain applications for the financial industry at our first-ever New York City event. The event informed and encouraged key stakeholders to move forward with financial regulatory reporting reforms.

    The half-day gathering at Thomson Reuters headquarters in Times Square highlighted the modernization initiatives already underway and looked to the future. Here is a glimpse of what attendees learned throughout the event:

    Open data standards are becoming the norm – Thomson Reuters is leading the charge, Washington is making moves

    Open PermID Linked Data Graph. Source: https://permid.org/

    Thomson Reuters has moved away from the traditional model of charging for a standard, signaling the growth of the open data ecosystem. Rather than selling the standard itself, financial services leaders view the data and analysis on financial entities and instruments as a value-add service for clients. Thomson Reuters developed the Open PermID, which exemplifies open data developments in financial services, and is in line with the open-data movement happening in Washington.

    The PermID has created an efficient, transparent, and systematic market. As Thomson Reuters  states:

    The ID code is a machine-readable identifier that provides a unique reference for a data item. Unlike most identifiers, PermID provides comprehensive identification across a wide variety of entity types including organizations, instruments, funds, issuers, and people. PermID never changes and is unambiguous, making it ideal as a reference identifier. Thomson Reuters has been using PermID in the center of our own information model and knowledge graph for over seven years.
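    The passage above describes a persistent, unambiguous reference identifier. As a rough illustration of that general idea (not Thomson Reuters’ actual data model or API), the Python sketch below keys records by an identifier that never changes, so references keep resolving even when an entity’s name does; the IDs and records are invented.

        # Sketch of the general idea behind a persistent identifier: key records by an
        # ID that never changes, so references survive name changes. IDs are invented.
        entities = {
            "1-000000000001": {"name": "Example Holdings Inc.", "type": "organization"},
            "1-000000000002": {"name": "Example Growth Fund", "type": "fund"},
        }

        references = ["1-000000000001", "1-000000000002"]  # stored by ID, not by name

        # The organization rebrands; only the attribute changes, never the key.
        entities["1-000000000001"]["name"] = "Example Global Holdings Inc."

        for ref in references:
            print(ref, "->", entities[ref]["name"])  # references still resolve unambiguously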

    Open data standards like PermID, used for financial instruments, and the Legal Entity Identifier (LEI) can provide meaningful changes in the way financial data is collected and synthesized. Both the PermID and LEI are examples of how scalable open standards can improve internal efficiency and external transparency for financial regulatory reporting. While the private sector develops viable use cases, policymakers in Washington are taking action to drive modernization in financial regulatory reporting.

    State of Financial Regulatory Reporting

    In Washington, three key policy milestones have occurred over the past 18 months, demonstrating that agency officials, the Administration, and Congress are driving modernization.

    July 16, 2018: The House Financial Services Committee ultimately did not include an anti-SEC data measure in its House-passed JOBS & Investor Confidence Act — a package of thirty-two bipartisan job creation bills. The Small Company Disclosure Simplification Act (H.R. 5054), left out of the compromise JOBS Act 3.0 package, remains a controversial measure lacking broad support in the Committee.

    June 29, 2018: The SEC voted to adopt Inline XBRL for corporate financial data disclosure (see the final rule). The move to Inline XBRL will end duplicative documents-plus-data financial reporting and transition to data-centric reporting (a minimal sketch of what inline tagging makes possible follows these milestones). This initiative is part of a broader modernization of the SEC’s entire disclosure system. The Data Coalition and its member companies (see comments from Workiva, Deloitte, Morningstar, and Grant Thornton) have long supported the adoption of Inline XBRL at the SEC, and the Coalition’s comment letter further explains our support of the SEC’s decision.

    June 13, 2017: Treasury Secretary Steven Mnuchin testified before the House Appropriations Committee in defense of the Treasury Department’s Fiscal Year (FY) 2018 Budget request. Mnuchin’s testimony showed an opening to standardize data fields and formats across the nation’s overlapping financial regulatory regimes – just as the Data Coalition has already been recommending to Congress.
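    For readers unfamiliar with Inline XBRL, the following minimal Python sketch illustrates the idea behind the June 2018 milestone: the human-readable HTML document itself carries machine-readable facts that software can extract directly. The snippet, element names, and taxonomy namespace are simplified illustrations, not an excerpt from an actual SEC filing.

        # Sketch: the same Inline XBRL document a person reads is the one a machine
        # parses. The snippet, context, and namespaces are simplified illustrations.
        from lxml import etree

        IXBRL_SNIPPET = """
        <html xmlns="http://www.w3.org/1999/xhtml"
              xmlns:ix="http://www.xbrl.org/2013/inlineXBRL"
              xmlns:us-gaap="http://fasb.org/us-gaap/2023">
          <body>
            <p>Revenues were $
              <ix:nonFraction name="us-gaap:Revenues" contextRef="FY2023"
                              unitRef="usd" decimals="-6" scale="6">1,234</ix:nonFraction>
              million for the year.</p>
          </body>
        </html>
        """

        IX_NS = "{http://www.xbrl.org/2013/inlineXBRL}"

        root = etree.fromstring(IXBRL_SNIPPET)
        for fact in root.iter(IX_NS + "nonFraction"):
            # Pull the tagged fact straight out of the human-readable page.
            print(fact.get("name"), fact.get("contextRef"), fact.text)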

    Regulators need standards for accuracy, analysis, and fraud reduction

    Financial regulators rely heavily, in some cases solely, on corporate filings and data analytics to detect fraud schemes – including the practice colloquially known as the “pump and dump.” This scheme attempts to inflate the price of a stock through recommendations based on false, misleading, or greatly exaggerated statements. Corporate financial filings are essential to accurately identifying and rooting out fraudulent activity, ultimately protecting investors.

    If financial regulators adopted data standards across all reporting systems, it would make identifying fraud far easier. That is why our Coalition is working to persuade Congress to pass the Financial Transparency Act (H.R. 1530) (summary here). The FTA would require the eight major financial regulators to adopt common data standards for the information they collect from the private sector. The bill has gained thirty-two bipartisan cosponsors. When passed, it will be the nation’s first RegTech law.

    The most notable “pump and dump” schemes include ZZZZ Best Inc., Centennial Technologies, and Satyam Computer Services.

    Why the financial industry should embrace open data standards

    During the final panel of the day, attendees heard financial industry experts describe why their colleagues should get behind open standards, and why the financial industry should welcome regulatory action.

    Currently, financial entities rely on mostly manual reporting processes as they send information to government regulators. Under this outdated system, companies typically have one group responsible for preparing the reports and another responsible for issuing them. For larger companies, error is nearly inevitable in a structure so heavily reliant on layers of human oversight.

    Data standardization means lower compliance costs for the financial industry, more efficient enforcement for regulators, and better transparency for investors – but only if regulators work together to modernize.


  • August 20, 2018 9:00 AM | Anonymous member (Administrator)

    Guest blog by Robin Doyle, Managing Director, Office of Regulatory Affairs, J.P. Morgan Chase & Co.

    In May 2018, J.P. Morgan Chase published an article on the topic of data standardization, "Data Standardization - A Call to Action." The article called for the financial services industry, global regulators, and other stakeholders to make progress on addressing current deficiencies in financial data and reporting standards that would enhance the usability of the financial data for informational needs and risk management, as well as with innovative technologies like artificial intelligence and machine learning.

    The article called for both global and national regulators to review the state of data and reporting standardization, and to take action to make improvements within these areas. Within the United States, the need for such a review is urgent. The U.S. regulatory reporting framework is fragmented, and the lack of coordination across agencies results in reporting requirements that are often duplicative and overlapping. Each agency is focused on collecting data in its own way, with its own definitions. This leads to higher costs for financial institutions to manage compliance and poorer quality and comparability of data for both regulators and firms.

    Specifically, whenever new data or a report is requested with slight differences in definitions or granularity, it triggers a new reporting process, including reconciliations to other reports and U.S. GAAP numbers, as well as obtaining corresponding sign-offs and attestations. The lack of common data standards and reporting formats across the agencies makes reporting complex and incomparable. Separate supervisory and examination processes occur as a result of the multi-agency, multi-reporting framework. Here are two areas that highlight the issues at hand:

    1. Top Counterparty Reporting: There are multiple reports that collect information about a firm’s top counterparties including the Office of the Comptroller of the Currency (OCC) Legal Lending Limit Report, Financial Stability Board (FSB) common data collection of Institution-to-Institution Credit Exposure Data (e.g., Top 50 Counterparty Report), Federal Financial Institutions Examination Council (FFIEC) 031 (i.e., bank “Call Report”), the new Fed Single Counterparty Credit Limit Top 50 Report, and others. Each of these reports has a slightly different scope and uses different definitions for aggregation of exposures, resulting in significant work to produce the overlapping reports and explain the differences in the reported results.1

    2. Financial Institutions Classification: There are numerous reporting requirements – regulatory capital deductions (e.g., FFIEC 101 Schedule A & FR Y-9C, Schedule HC-R), risk-weighted assets (e.g., Asset Value Correlation calculation FFIEC 101 Schedule B), systemic risk (e.g., Federal Reserve’s Banking Organization Systemic Risk Report—FR Y-15 Schedule B), and liquidity (e.g., Complex Institution Liquidity Monitoring Report FR 2052a), among others – that aggregate and report data with the classification “Financial Institution,” each using a different definition of “Financial Institution.” While on the surface this may not seem complicated, the reality is that firms have full teams of people who parse data across these different definitions to ensure reporting is done correctly and can be reconciled. In a large firm, efforts to create tagging systems to automate the parsing process can take years and multiple additional headcount to implement.2 (A hypothetical sketch of this parallel classification logic follows these examples.)
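    To illustrate the burden described in the second example, here is a hypothetical Python sketch of the parallel classification logic firms must maintain. The two definitions below are invented stand-ins, not the actual FR Y-15 or FFIEC 101 definitions of “financial institution.”

        # Hypothetical illustration: the same counterparty is classified separately
        # under each report's own definition of "financial institution". The two
        # rules below are invented, not the actual FR Y-15 or FFIEC 101 definitions.
        counterparty = {
            "name": "Example Insurance Co.",
            "sector": "insurance",
            "regulated_depository": False,
            "total_assets_usd_bn": 40,
        }

        def is_financial_institution_report_a(cp) -> bool:
            # Report A (invented rule): depositories and broker-dealers only.
            return cp["regulated_depository"] or cp["sector"] in {"bank", "broker-dealer"}

        def is_financial_institution_report_b(cp) -> bool:
            # Report B (invented rule): any financial-sector firm above an asset threshold.
            return (cp["sector"] in {"bank", "broker-dealer", "insurance", "fund"}
                    and cp["total_assets_usd_bn"] >= 10)

        a = is_financial_institution_report_a(counterparty)
        b = is_financial_institution_report_b(counterparty)
        print(f"Report A: {a}, Report B: {b}")  # False vs. True: one firm, two answers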

    The U.S. regulatory community is aware of this – in the commentary from the recent Y-15 information collection rule, the Federal Reserve acknowledges the conflict but does not address the burden:

    One commenter noted that the definition of ‘financial institution’ in the FR Y-15 is different from other regulatory reports and recommended aligning the varying definitions. In response, the Board acknowledges that its regulations and reporting sometimes use differing definitions for similar concepts and that this may require firms to track differences among the definitions. Firms should review the definition of ‘financial institution’ in the instructions of the form on which they are reporting and should not look to similar definitions in other forms as dispositive for appropriate reporting on the FR Y-15.

    These issues could be addressed through the use of common data and reporting standards across the agencies. The Financial Stability Oversight Council (FSOC) could take steps within its mandate to facilitate coordination among its member agencies towards the standardization of regulatory reporting requirements across the agencies.3

    The FSOC could initiate a review of the current state of data and reporting within the U.S. to identify overlapping and duplicative reporting requirements and opportunities to move from proprietary data standards to national and global standards. Based on the review, a roadmap could be established to address the issues and gaps identified. Innovative approaches to data collection could be established, such as single collections that are then shared among agencies, and global reference data should be used in all cases where it exists. Further, mechanisms could be created to ensure better coordination among agencies in the process of rulemaking to avoid duplication and to leverage consistent, established data standards.

    The benefits of such improvements would be substantial. Better standardization of regulatory reporting requirements across the agencies would significantly improve the ability of the U.S. public sector to understand and identify the buildup of risk across financial products, institutions, and processes.

    Reducing duplication, streamlining reporting, and using data standards would create efficiencies, saving the time and costs that firms and regulators otherwise expend manually collecting, reconciling, and consolidating data. According to the U.S. Department of the Treasury’s Office of Financial Research (OFR), the estimated cost to the global industry from the lack of data uniformity and common standards runs into the billions of dollars.4

    Looking forward, having good quality, standardized data is an important stepping stone to reaping the benefits of the ongoing digitization of financial assets, digitization of markets and growing use of new, cutting-edge technologies, such as artificial intelligence. Many areas of the financial industry will be impacted, in some capacity, by these innovations in the coming years. These areas may include customer service, investment advice, contracts, compliance, anti-money laundering and fraud detection.

    We urge the U.S. regulatory community to heed this call to action.


    1. Links to report references: https://occ.gov/topics/credit/commercial-credit/lending-limits.html; https://www.fsb.org/policy_area/data-gaps/page/3/; https://www.fsb.org/wp-content/uploads/r_140506.pdf; https://www.newyorkfed.org/banking/reportingforms/FFIEC_031.html; https://www.federalreserve.gov/reportforms/formsreview/FR2590_20180620_f_draft.pdf)
    2. Links to referenced reports: https://www.ffiec.gov/forms101.htm; https://www.federalreserve.gov/reportforms/forms/FR_Y-1520170331_i.pdf; https://www.federalreserve.gov/reportforms/forms/FR_2052a20161231_f.pdf 
    3. Dodd-Frank Wall Street Reform and Consumer Protection Act, Sec. 112 (a)(2)(E)
    4. Office of Financial Research: Breaking Through Barriers Impeding Financial Data Standards (February, 2017).


  • August 07, 2018 9:00 AM | Anonymous member (Administrator)

    Last week, the Data Coalition responded to the newly released Federal Data Strategy which we summarized in a blog two weeks ago.

    The Federal Data Strategy is an outgrowth of the President’s Management Agenda, specifically Cross Agency Priority Goal #2 – Leveraging Data as a Strategic Asset, which is co-led by the Office of Management and Budget (OMB), the Office of Science and Technology Policy (OSTP), the Department of Commerce (DOC), and the Small Business Administration (SBA). Administration officials within these agencies called for public feedback and are currently working through the responses. We expect to see more detailed plans between October and January 2019 (see page 11 of the recent action plan).

    Our response provided high-level commentary on the draft principles as well as six proposed use cases that the Administration could potentially work into the prospective Data Incubator Project.


    Commentary on Draft Principles:

    The Federal Data Strategy proposes 10 principles spread across three categories: Stewardship, Quality, and Continuous Improvement.

    Overall, we emphasized the benefits of assuring federal data assets are in open and machine-readable formats that impose uniform and semantic structure on data, thus mitigating organizational uncertainties and expediting user development. We also discussed the importance of pursuing data standardization projects that identify common data elements across organizations and enforce standards.

    For Stewardship, it is important to ensure that data owners and end users are connected in ways that assure data is presented in useful ways and that data quality can be continuously improved.

    With regards to Quality, it is important to establish policies to assure core ‘operational’ and ‘programmatic’ data assets are accurate, consistent, and controlled. We note simply that it is at the point of ingestion that any data standards or quality thresholds should be enforced. As a starting place for data strategy principles, we recommend incorporating the open data principles identified by the CIO Council’s Project Open Data.

    And finally, for Continuous Improvement, we recommend that data should be made available in open formats for bulk download by default. This allows for maximum stakeholder engagement from the beginning.


    Six Proposed Open Data Use Cases:

    We also propose the following six use cases for the Administration to work on:

    Use Case 1: Fix Public Company Filings (Access, use, and augmentation)

    The Securities and Exchange Commission (SEC) requires public companies to file financial statements in standardized XBRL format, but the standard has complications. Currently, the format allows for too much custom tagging, inhibiting the goals of comparability and transparency. The Administration should work with the SEC and the Financial Accounting Standards Board (FASB) to ensure that the U.S. Generally Accepted Accounting Principles (US GAAP) taxonomy enforces FASB rules as the true reference for all elements in the taxonomy, thus eliminating unnecessary tags, reducing overall complexity, and minimizing the creation of extension data elements. This will ultimately improve comparability and data quality.
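    As a rough illustration of the comparability problem this use case targets (not an SEC or FASB methodology), the short Python sketch below measures how heavily a filing leans on custom extension tags instead of standard US GAAP taxonomy elements; the element names and facts are invented.

        # Rough sketch (element names invented): how much of a filing relies on custom
        # extension tags rather than standard US GAAP taxonomy elements.
        facts = [
            "us-gaap:Revenues",
            "us-gaap:OperatingExpenses",
            "abc:AdjustedCommunityRevenueMetric",   # filer-specific extension element
            "us-gaap:NetIncomeLoss",
            "abc:SpecialOneTimePlatformCharge",     # filer-specific extension element
        ]

        extension_facts = [f for f in facts if not f.startswith("us-gaap:")]
        extension_rate = len(extension_facts) / len(facts)
        print(f"Extension rate: {extension_rate:.0%}")  # 40% of these facts are not comparable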

    Use Case 2: Documents to Data in Management Memorandum (Decision-making and Accountability)

    Congress has already taken on the challenge of adopting a data standard for laws and mandates via the United States Legislative Markup (USLM), which provides a framework for how the Administration can transform federal documents into open data. The Administration should publish federal management guidance in integrated, machine-readable data formats instead of documents. This will allow agencies to better understand how policies integrate with each other and thus work to comply more readily, and allow the public and Congress to better understand the specific factors guiding and constraining agency programs and leadership.

    Use Case 3: Entity Identification Working Group (Enterprise Data Governance)

    Currently, the federal government uses a variety of different codes to identify companies, nonprofits, and other non-federal entities, which makes matching data sets across federal agencies a time-consuming and expensive undertaking. Adoption of the Legal Entity Identifier (LEI) as the default identification code for legal entities will enable agencies to aggregate, compare, and match data sets critical to their regulatory and programmatic missions.
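    Here is a minimal, hypothetical Python sketch of what a shared identifier makes trivial: matching records about the same entity across two agencies’ datasets with a single join key. The columns, values, and LEI strings are invented for illustration.

        # Sketch with invented data: a shared LEI lets two agencies' datasets be matched
        # on a single join key instead of fuzzy name matching across incompatible IDs.
        import pandas as pd

        contracts = pd.DataFrame({
            "lei": ["529900EXAMPLELEI0042", "529900EXAMPLELEI0077"],
            "contract_value_usd": [1_200_000, 350_000],
        })

        enforcement = pd.DataFrame({
            "lei": ["529900EXAMPLELEI0042"],
            "open_enforcement_actions": [2],
        })

        merged = contracts.merge(enforcement, on="lei", how="left")
        print(merged)  # the contractor with open enforcement actions is now visible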

    Use Case 4: Mission Support or Operational Data Standards Coordination (Decision-Making and Accountability)

    Treasury and the Office of Management and Budget (OMB) have spent over four years working to establish and integrate the DATA Act Information Model Schema (DAIMS), which links budget, accounting, procurement, and financial assistance datasets – operational data – that were previously segmented across federal agencies. The Administration should utilize the DAIMS for modernizing the annual budget process, agency financial reporting, and agency performance reporting, thus allowing for easy use of data to compare, justify, and plan budget goals and agency spending.

    Use Case 5: Mission or Programmatic Data Standards Coordination (Enterprise Data Governance; Decision-Making and Accountability; Access, Use, and Augmentation)

    To build a common approach to multi-agency programmatic data sharing, the Departments of Homeland Security and Health and Human Services created the National Information Exchange Model (NIEM), which maintains a data dictionary of common fields allowing agencies to create formats using those fields. The Administration should consider endorsing NIEM as the government-wide default for programmatic data standardization and publication projects. This will afford agencies the easier path of reusing common data fields of the NIEM Core, rather than building their own data exchanges and reconciliation processes.

    Use Case 6: Establish A Standard Business Reporting Task Force to Standardize Regulatory Compliance (Enterprise Data Governance; Access, Use, and Augmentation)

    Standard Business Reporting (SBR), which has been fully implemented in Australia, demonstrates that regulatory agencies can reduce the compliance burden on the private sector by replacing duplicative forms with standardized data, governed by common data standards across multiple regimes. The Administration should convene a task force representing all major U.S. regulatory agencies to create a roadmap for standardizing the data fields and formats that they use to collect information from the private sector. While the full implementation of a U.S. SBR program would require a multi-year effort, the creation of an exploratory task force would put the policy barriers and necessary investments into scope.

    Other Organizations’ Feedback Echoes an Open Data Approach

    While the responses have not yet been made public in a central portal, we have gathered a few of the key submissions.

    The Bipartisan Policy Center (BPC) has issued two separate comment letters. The first letter, on behalf of the former leadership of the Commission on Evidence-Based Policymaking, summarizes the Commission’s recommendations. Their second letter summarizes recommendations made by the BPC-coordinated Federal Data Working Group, which the Data Coalition works with. Here we have joined a call to clarify the guidance from the 2013 open data Executive Order (M-13-13) (e.g., define “data asset” and renew the focus on data inventories), leverage NIEM to develop data standards, look into harmonizing entity identifiers across agencies, explore preemptive implementation of the Foundations for Evidence-Based Policymaking Act (H.R. 4174), which includes the OPEN Government Data Act, and define terminology for types of public sector data (i.e., similar to our comment’s demarcation between operational and programmatic data).

    The Center for Data Innovation (CDI) think tank also provided feedback that calls for the administration to support the passage of the OPEN Government Data Act as “the single most effective step” the administration could take to achieve the goals of the Federal Data Strategy. Additionally, CDI calls for improvements to data.gov’s metadata, for OMB to establish an “Open Data Review Board” for incorporating public input in prioritizing open data projects, and for the Administration to establish “data trusts” to facilitate sharing of non-public data. Lastly, they make the point to consider how the Internet of Things (IoT) revolution and Artificial Intelligence (AI) should be included in the conversation.

    The data standards organization XBRL-US recommends that the Administration “require a single data standard for all financial data reporting…to establish a single data collection process,” adopt the Legal Entity Identifier for all entities reporting to the federal government and use automated validation rules to ensure data quality at the point of submission.
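    As an illustration of the kind of automated validation XBRL US recommends applying at the point of submission (the specific check below is a generic example, not an actual XBRL US or Data Quality Committee rule), a submission-time rule might look like this in Python:

        # Generic example of a submission-time validation rule (not an actual XBRL US
        # or Data Quality Committee rule): reject filings whose facts do not add up.
        def validate_submission(facts: dict) -> list[str]:
            """Return a list of validation errors for a submitted set of facts."""
            errors = []
            required = ("Assets", "Liabilities", "StockholdersEquity")
            missing = [name for name in required if name not in facts]
            if missing:
                return [f"Missing required elements: {missing}"]
            if facts["Assets"] != facts["Liabilities"] + facts["StockholdersEquity"]:
                errors.append("Assets must equal Liabilities plus StockholdersEquity")
            return errors

        # Catching the error at submission is cheaper than reconciling it afterward.
        print(validate_submission({"Assets": 100, "Liabilities": 60, "StockholdersEquity": 30}))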

    The new State CDO Network sent a letter emphasizing the important role of State and local governments. They wrote, “[States are] in the unique position of creating and stewarding data based on federal requirements,” while calling for a formal plan to leverage administrative data to address fraud, waste, and abuse.

    The Preservation of Electronic Government Information (PEGI) Project calls for an advisory board to make recommendations on data management and stewardship while echoing our call to utilize the open government data principles and also incorporate the FAIR lifecycle data management principles. PEGI also calls for scalable and automated processes for maximizing the release of non-sensitive data on data.gov.

    Lastly, the American Medical Informatics Association (AMIA) identifies the publication and the harmonization of data dictionaries across agencies as two fundamental activities. They also call for collecting and creating information in ways that support “downstream information processing and dissemination,” establish a framework to help agencies implement a “portfolio approach” to data asset management, and for the Administration to extend the concept of “data as an asset” to information produced by federal grant recipients and contractors.

    The Data Coalition will be working with these groups and others to align the Administration’s efforts to establish a pragmatic, sustainable, and effective Federal Data Strategy.


  • March 30, 2018 9:00 AM | Anonymous member (Administrator)

    The inaugural RegTech Data Summit’s thesis was that regulatory rules, technology, and data must be modernized in a coordinated fashion. If all three areas are modernized in tandem, new RegTech solutions will flourish, reducing reporting duplication, minimizing reporting errors, and enabling automation.

    When Regulation, Technology, and Data intersect – change happens.

    Over 400 participants, 37 speakers, three live technology demos, and over 20 exhibitors agreed: we were right.

    Throughout the day, we looked at the state of regulatory reporting regimes, solutions that exist, and what could be improved by modernization – “What is?” and “What if?”

    Here are my top 10 moments:

    1. SEC: RegTech can “make everyone’s lives easier”
    2. XBRL has an image problem
    3. A vision for a universal, non-proprietary identifier
    4. Demo: Achieving an open law vision
    5. Demo: Blockchain for continuous audits
    6. Demo: SBR is happening down under; businesses and government are both saving
    7. Silicon Valley Keynote: Private-public collaboration is key
    8. A coming convergence of regulation, technology, and data
    9. Looking ahead: What does the future look like for RegTech?
    10. The numbers

    Let’s dive into each of these moments!

    1. SEC keynote: RegTech can “make everyone’s lives easier”

    From left to right: Yolanda Scott Weston, a principal at Booz Allen Hamilton and Michael Piwowar, Commissioner, SEC

    SEC Commissioner Michael Piwowar kicked off the RegTech Data Summit. The Commissioner outlined his definition of RegTech:

    “[C]overs the use of technology by regulators to fulfil their duties in a more thorough and efficient manner…. RegTech also refers to the use of technology by regulated entities to streamline their compliance efforts and reduce legal and regulatory costs. Most importantly, the term covers collaboration between private and public actors to take advantage of existing technologies to make everyone’s lives easier.”

    • Watch SEC Commissioner Piwowar’s keynote address.

    2. XBRL has an image problem

    From left to right: Leslie Seidman, Former Chair, Financial Accounting Standards Board and Giancarlo Pellizzari, Head, Banking Supervision Data Division, European Central Bank

    • What is? Currently, the SEC has a dual reporting regime–HTML filing and XBRL filing. That’s burdensome! For over three years, the Data Coalition has been advocating for the SEC to move away from this duplicative reporting system. Leslie Seidman, former FASB Chairman, noted that “XBRL is suffering from an image crisis… most companies view this as a burden that’s deriving them no benefit whatsoever.”
    • What if? Seidman went on to recommend how advocates of structured data should describe its benefits to corporate America,

    “[S]how [corporations] the extent of manual effort compared to an automated process using XBRL data–that alone, by combining the processes… you will clearly be saving money because you would have one group who is responsible for preparing and issuing those financial reports… Talk candidly and influentially about these potential risks and costs that exist now, and how the iXBRL solution will actually reduce their risk and cost.”

    • Our Coalition strongly supports the Financial Transparency Act (FTA) (H.R. 1530) (summary here), currently pending in the U.S. House, to direct the SEC to replace its duplicative documents-plus-data system with a single submission, both human- and machine-readable. The bill has 32 bipartisan cosponsors. When passed, it will be the nation’s first RegTech law.

    3. A vision for a universal, non-proprietary identifier

    • What is? Regulatory agencies use (approximately) 18 different identifiers to track the entities they regulate; companies maintain internal IDs; and proprietary ID systems are fatally flawed. There is no ID to rule them all. The hodgepodge of identifiers impedes technological solutions, frustrates compilers and enforcers, and wastes everyone’s time.

    4. Demo: A vision for machine-readable regulation

    • What is? Member company Xcential demoed how Congress is moving away from static documents to adopt open data standards. Xcential is helping the Clerk of the House and the Office of the Law Revision Counsel create and apply an open data format to legislative materials; the project is known as the U.S. House Modernization project. Xcential’s software can natively draft and amend bills using the XML-based United States Legislative Markup (USLM). Other projects Xcential is working on include the U.K. Legislative Drafting, Amending, & Publishing Programme, which is publishing rules, regulations, orders, directives, proclamations, schemes, and by-laws (bye-laws) in open data formats, fully machine-readable! (A minimal parsing sketch follows this list.)

    • What if? If laws, regulations, and other legal materials were all published in open formats, machine-readable regulation would improve compliance, reduce human effort, and shrink compliance costs. Legislation to keep an eye on: the Searchable Legislation Act (SLA) (H.R. 5143); the Statutes at Large Modernization Act (SALMA) (H.R. 1729); and the Establishing Digital Interactive Transparency (EDIT) Act (H.R. 842).
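    The parsing sketch referenced above: a minimal Python example of what machine-readable law enables, listing section identifiers and headings straight from USLM-style XML. The document snippet is a simplified, invented stand-in rather than actual Clerk of the House output, and the namespace is assumed from the published USLM schema.

        # Sketch of what machine-readable law enables: listing sections and headings
        # straight from USLM-style XML. The document is an invented stand-in, and the
        # namespace is assumed from the published USLM schema.
        from lxml import etree

        USLM_SNIPPET = """
        <uslm:lawDoc xmlns:uslm="http://xml.house.gov/schemas/uslm/1.0">
          <uslm:main>
            <uslm:section identifier="/us/usc/t5/s101">
              <uslm:num>Sec. 101.</uslm:num>
              <uslm:heading>Example heading one</uslm:heading>
            </uslm:section>
            <uslm:section identifier="/us/usc/t5/s102">
              <uslm:num>Sec. 102.</uslm:num>
              <uslm:heading>Example heading two</uslm:heading>
            </uslm:section>
          </uslm:main>
        </uslm:lawDoc>
        """

        NS = {"uslm": "http://xml.house.gov/schemas/uslm/1.0"}
        law = etree.fromstring(USLM_SNIPPET)
        for section in law.findall(".//uslm:section", NS):
            heading = section.findtext("uslm:heading", namespaces=NS)
            print(section.get("identifier"), "-", heading)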

    5. Demo: Blockchain for continuous audit

     

     

    • What is? Audits are burdensome, costly, and time-consuming, and they provide only a once-a-year or once-a-quarter picture of an organization’s finances.
    • What if? Auditchain co-founder Jason Meyers and director of assurance and XBRL architecture Eric Cohen demoed a blockchain-based approach to audits that focuses on a continuous stream of an organization’s transactions instead of annual or quarterly reports. The Summit audience saw a fully integrated continuous audit and reporting ecosystem for traditional and decentralized enterprises. The approach combines a standardized transaction model on the back end with XBRL on the front end to produce dynamic, customizable reports for stakeholders. (A toy sketch of the continuous-audit concept follows.)
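
    As a very rough sketch of the continuous-audit idea (a generic illustration, not Auditchain’s actual architecture), the example below appends each transaction to a hash chain as it occurs, so an auditor can verify the ledger at any moment instead of waiting for a quarterly or annual report.

        import hashlib, json

        def append_block(chain, transaction):
            """Add one transaction to the hash chain as it occurs."""
            prev_hash = chain[-1]["hash"] if chain else "0" * 64
            payload = json.dumps(transaction, sort_keys=True)
            block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
            chain.append({"tx": transaction, "prev": prev_hash, "hash": block_hash})

        ledger = []
        append_block(ledger, {"date": "2018-04-23", "account": "1000-Cash", "amount": -125.00})
        append_block(ledger, {"date": "2018-04-23", "account": "6000-Travel", "amount": 125.00})

        # An auditor can re-derive every hash at any time to confirm nothing was altered.
        for i, block in enumerate(ledger):
            expected_prev = ledger[i - 1]["hash"] if i else "0" * 64
            recomputed = hashlib.sha256(
                (block["prev"] + json.dumps(block["tx"], sort_keys=True)).encode()
            ).hexdigest()
            assert block["prev"] == expected_prev and recomputed == block["hash"]
        print("ledger verified:", len(ledger), "transactions")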


    6. Demo: SBR is happening down under; businesses and government are both saving

    • What is? Standard Business Reporting (SBR) is already in use in Australia: government agencies share a single reporting taxonomy, so businesses can file required reports directly from their accounting software, saving time and money for both businesses and government.

    • What if? Matt Vickers of Xero outlined what the benefits would be for the United States if Standard Business Reporting (SBR) were adopted: “the economy is 15 times larger, and the tax system and regulatory complexity is comparable.” The steps that need to be taken to ensure SBR is successfully implemented include “1. Developing a single taxonomy and 2. Engaging early with software companies.” (A simplified sketch of the single-taxonomy idea appears after this list.)
    • The Data Coalition continues to push for legislation to replace financial regulatory documents with open data and to support a longer-term move towards SBR in the United States. Our sister organization, the Data Foundation, explained how this might work in a report last year, co-published with PwC: Standard Business Reporting: Open Data to Cut Compliance Costs.
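
    Here is a deliberately simplified sketch of the SBR idea: accounting software maps its internal fields to one government-wide taxonomy once, and the same tagged dataset can then satisfy several agencies’ filings. The taxonomy fields, amounts, and agency filings below are invented for illustration.

        # One shared, government-wide data dictionary (illustrative fields only).
        TAXONOMY = {
            "GrossRevenue":  {"definition": "Total revenue for the period", "type": "monetary"},
            "WagesPaid":     {"definition": "Total wages and salaries paid", "type": "monetary"},
            "EmployeeCount": {"definition": "Average number of employees", "type": "integer"},
        }

        # Data exported once from the business's accounting software.
        ledger_export = {"GrossRevenue": 1250000, "WagesPaid": 430000, "EmployeeCount": 12}

        def build_filing(tagged_data, required_fields):
            """Assemble one agency's filing from the shared, already-tagged dataset."""
            assert all(field in TAXONOMY for field in required_fields)  # one shared dictionary
            return {field: tagged_data[field] for field in required_fields}

        tax_filing = build_filing(ledger_export, ["GrossRevenue", "WagesPaid"])
        stats_filing = build_filing(ledger_export, ["EmployeeCount", "WagesPaid"])
        print(tax_filing)
        print(stats_filing)

    The point of the sketch: once the taxonomy is fixed and software vendors build to it, each additional filing is a query against data the business already holds, not a new form.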

    7. Silicon Valley Keynote: Private-public collaboration is key

    • Joe Lonsdale, co-founder of Palantir and OpenGov, delivered a compelling keynote address on how our government and Silicon Valley can partner to improve the way government collects and publishes regulatory information. Here’s a snippet:

    “It is possible for a team of entrepreneurs to very meaningfully impact government… I don’t think these things get fixed by insiders. It’s just not how the world ever works. It is always outsiders partnering with allies on the inside and figuring out how to adopt technology that’s going to upgrade all these processes.”

    • Joe announced the founding of a new startup, Esper, which will work with regulatory agencies to automate the rulemaking process. Watch his keynote address here!

    8. A coming convergence of regulation, technology, and data


    • What is? Francis Rose, host of Government Matters, moderated the Convergence Panel, which featured the insights panelists had gathered throughout the day and brought the day’s theme together: regulation, technology, and data must be modernized in a coordinated fashion to enable RegTech solutions. Panelists agreed this is “no easy task.”
    • What if? Panelist Adam White of GMU said it best when he described what needs to happen for regulation, technology, and data to be modernized: “Agencies need to be brought together in a collaborative way … that would benefit immensely from standardized data and more transparency, allowing agencies to work on a common basis of facts across the board.”

    9. Looking ahead: What does the future look like for RegTech?

    • What is? More than 200 regulators at the local, state, and federal levels have disparate systems. The regulators continue to collect document-based filings, rather than using existing RegTech solutions to collect the information as open, standardized data. And they continue to issue regulations as documents, rather than exploring machine-readable regulation.

    From left to right: Steven Balla, GMU, Jim Harper, Competitive Enterprise Institute (former), and Sarah Joy Hays, Data Coalition

    • What if? Steven Balla of GMU said three things need to happen to transform regulatory information collection and rulemaking from documents into data: 1. “Agency leaders need to position themselves and their organization as innovators. We can[not] underestimate the importance of allies with[in] agencies; 2. We need the relevant authorities in the Executive Branch to have the coordination function, specifically OMB’s Office of Information and Regulatory Affairs; 3. [and,] finally, leadership on Capitol Hill. There is nothing more effective than a law or budget to move organizational behavior.”
    • Jim Harper, formerly of the Competitive Enterprise Institute, got right to the point: “To get all agencies on the same data standards there is, on one hand, shame, and then there is political imperative.”

    10. The numbers


  • March 23, 2018 9:00 AM | Anonymous member (Administrator)

    This week the White House endorsed a data-centric approach to modernizing and restoring trust in government. For data companies and data transparency, the newly-unveiled President’s Management Agenda (PMA) does not disappoint.

    Where did this agenda come from?

    A year ago the White House issued an Executive Order, the Comprehensive Plan for Reorganizing the Executive Branch, and a corresponding government-wide reform plan (see M-17-22). Our prior blog on the reform plan implored the Administration to make operational and material data a central focus in modernizing government management.

    With the release of the PMA, that is what the White House has done.

    The PMA’s Fourteen Goals: A Performance Agenda Grounded in Law

    As a whole, the PMA should be read in concert with the President’s Fiscal Year 2019 budget request and the corresponding agency 2018-2022 strategic plans. However, thanks to the Government Performance and Results Act (GPRA) Modernization Act of 2010 (P.L. 111-352), which established Performance.gov, you do not need to laboriously parse out the individual goals from these reams of disconnected documents.

    Instead, the PMA is broken down into fourteen discrete Cross-Agency Priority (CAP) Goals, representing the GPRA Modernization Act’s requirement for the executive branch to “identify major management challenges that are Governmentwide or crosscutting in nature and describe plans to address such challenges.”

    The unique quality of these CAP Goals is that they are “long[-]term in nature.” In GPRA, Congress designed the concept of “agency priority goals” to span Presidential transitions. In the law, “cross-agency goals” are on a four-year lifecycle with a requirement that they be established a full year after a new President takes office (see Sec. 5). We saw the benefits of this structure throughout 2017, as the previous Administration’s “Open Data” CAP Goal empowered agency leaders to keep pursuing data reforms through the first year of the new Administration’s transition (see the 2014-2018 goals archived here).

    Each CAP Goal names specific leaders who will be accountable for pursuing it. This accountability helps motivate progress and insulate the work from politics.

    Driving the PMA: “an integrated Data Strategy”

    With the PMA, the White House is putting data and data standards at the center of federal management. This matches our Coalition’s prior recommendations, and is good news for data companies and data transparency.

    The PMA identifies three overarching “drivers” of transformation: first, a focus on the government’s systems with IT Modernization (Goal 1: Modernize IT to Increase Productivity and Security); second, an integrated strategy around Data, Accountability, and Transparency (Goal 2: Leveraging Data as a Strategic Asset); and third, improved Workforce management (Goal 3: Developing a Workforce for the 21st Century).

    These three drivers, and the three CAP goals that correspond to them, intersect with the PMA’s eleven other CAP goals (see image).

    The White House’s decision to clearly separate IT systems from data (and data standards) is the right approach. The government’s data can be standardized and made more useful, and more transparent, without requiring major system changes.

    Therefore, the Data Coalition applauds the PMA’s central focus on the government’s need for “a robust, integrated approach to using data to deliver on mission, serve customers, and steward resources”–a focus that will now guide this Administration.

    Last July we made three recommendations for the PMA along these lines. We are pleased to see all three recommendations reflected in the final product.

    First, we recommended that “OMB should adopt the DATA Act Information Model Schema (DAIMS) as the primary government-wide operational data format to align various agency business functions.” That’s exactly what Goal 2 of the PMA now does.

    The “Access, Use, and Augmentation” strategy for Goal 2 “will build on work like the DATA Act Information Model Schema (DAIMS)” (see page 16 of the PMA) and “promote interoperability, data standardization, and use of consensus standards, specifications, metadata, and consistent formats” (page 8 of the action plan). This syncs with the Treasury Department’s recently-released Strategic Plan, which states that “[the DAIMS] can be expanded to include other administrative data and link more domains across the federal enterprise…to support decision-making and provide metrics for evaluating program performance and outcomes” (see page 30). The budget request backs this up with potential increased funding for the Treasury’s Bureau of the Fiscal Service, which would have resources for “continued operational support for execution of the [DATA Act]” (see pages 21-22).
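
    To illustrate, in a deliberately simplified way, why a common schema matters: once every agency reports the same elements, spending can be aggregated across the federal enterprise with a few lines of code. The field names and records below are illustrative stand-ins, not the actual DAIMS element list.

        from collections import defaultdict

        # Hypothetical spending records from two agencies that share the same elements.
        spending_feed = [
            {"agency": "Treasury", "object_class": "25.2", "obligation": 150000.0},
            {"agency": "Education", "object_class": "25.2", "obligation": 90000.0},
            {"agency": "Education", "object_class": "31.0", "obligation": 40000.0},
        ]

        totals = defaultdict(float)
        for record in spending_feed:          # every record carries the same elements
            totals[record["object_class"]] += record["obligation"]

        print(dict(totals))   # government-wide totals by object class

    Without a shared schema, the same rollup would require a bespoke mapping exercise for every agency feed.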

    Second, we recommended that the Administration leverage the work of the National Information Exchange Model (NIEM) for data governance work and information exchange across the government. If you read the PMA’s Goal 2 together with the 2019 budget request, you will find this recommendation validated as well.

    The “Enterprise Data Governance” strategy for Goal 2 calls for “develop[ing] a coordinated approach to managing communities of stakeholders in the Federal community and among external constituents” and better coordination of “existing governance bodies” (see page 7 of the action plan). Additionally, the 2019 budget request’s analytical perspective on “Building and Using Evidence to Improve Government Effectiveness” calls for the “development of interoperable data systems, which can communicate and exchange data with one another while maintaining the appropriate privacy and security protections” as “critical to realiz[ing] the full potential of shared administrative data.” The budget request goes on to praise NIEM as a model for “data exchange at all levels of government across program areas…in partnership with private industry stakeholders and state/local partners” (see page 5 of the analytical perspective).

    Third, we supported the continued implementation of Technology Business Management (TBM), a private-sector framework that helps organizations standardize the data used to classify technology investments, and recommended alignment with the DATA Act’s DAIMS.

    In the PMA, TBM is listed alongside the DAIMS in Goal 2 (see page 16 of the PMA), and the DATA Act is named as a supporting program in Goal 10: Federal IT Spending Transparency (see page 10 of the action plan).

    The PMA’s Other Goals: Grants Accountability, Paperless Forms, and (Maybe) Guidance Documents Expressed as Data

    Across the PMA’s other CAP Goals, we see a consistent data-centric approach and continued alignment with the Data Coalition’s Policy Agenda.

    As we celebrated yesterday, Goal 8: Results-Oriented Accountability for Grants “recognizes that recipient burden (such as excessive compliance requirements) can be reduced if grant reporting data is standardized” (see page 5 of the action plan). This aligns with the objectives of the Grant Reporting Efficiency and Agreements Transparency (GREAT) Act (H.R. 4887), which we are advocating for and which is making fast progress in Congress (see more).

    Goal 4: Improving Customer Experience introduces a “Paperless Government Project,” led by the US Digital Service, which would help agencies reduce redundant and unnecessarily complex forms. The Data Coalition is pushing reforms across a number of fronts that would apply open data concepts to simplify complex regulatory reporting (for instance, Standard Business Reporting).

    And Goal 6: Shifting From Low-Value to High-Value Work seeks to establish “regular processes to assess the burden [of OMB’s management guidance] on agencies and to rescind or modify requirements over time” (see page 5 of the action plan). The way to create such processes is for OMB to publish its guidance in integrated, machine-readable data formats instead of documents. Our work to pursue “open data for laws and mandates” provides a use case for exactly the same transformation, starting with Congressional laws, bills, and amendments.

    Each of the CAP Goals identifies the senior executives who will be accountable for delivering these promised reforms. We commend the Administration for explicitly recognizing both the executives accountable for these goals and the career staff who will be managing these efforts over the next four years.

    The Road Ahead

    As all this work takes shape, it will be important to remember the guiding statements which set the stage at the PMA’s launch event. Newly-appointed Office of Management and Budget Deputy Director for Management Margaret Weichert called data “a foundational asset to driving economic growth in innovation.” Incoming US Chief Information Officer Suzette Kent echoed with a call for a “strategic view of data as one of our mission critical assets.” It will be up to these new leaders to turn the PMA’s vision into a reality.

    The Data Coalition will continue to support data transparency and data standardization–which means we will work hard to hold the Administration accountable to these well-stated goals.


  • March 22, 2018 9:00 AM | Anonymous member (Administrator)

    The White House has published a plan to transform federal grant reporting from disconnected documents into open, standardized data.

    The Data Coalition views this as a big step forward! Supported by StreamLink Software and other leading data companies, we’ve been pushing for open data in grant reporting since 2013.

    Last Tuesday, as part of the long-awaited release of the President's Management Agenda, the White House announced fourteen new government-wide goals. Goal 8 is “Results-Oriented Accountability for Grants.”

    The White House recognizes that the government’s current system of grant reporting creates challenges for grantor agencies, grantees, and the populations they serve. Grantees must fill out complicated document-based forms to report on their receipt and use of grant funds.

    As a result, their managers report spending 40% of their time on compliance, according to a survey by REI Systems, the National Grants Management Association, and the George Washington University.

    Meanwhile, because these forms are submitted to over 2,200 separate program offices across the government, transparency is difficult. There is no easy way for agencies, beneficiaries, or the public to see a grantee’s performance across multiple programs.

    CURRENT STATE: According to the Data Foundation’s Transforming Grant Reporting, without a standardized data taxonomy, federal grant reporting is a mess!

    Last year, our sister organization, the Data Foundation, conducted an intensive program of research into the challenges of federal grant reporting, supported by StreamLink Software and Workiva. In December, the Foundation published its magnum opus: Transforming Grant Reporting, which recommended that the government should “replace document-based [grant reporting] forms with standardized, open data.”

    To accomplish that, we need a government-wide taxonomy, or data dictionary, which standardizes the data fields that all grantor agencies use to collect information from their grantees.
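
    Here is a minimal, hypothetical sketch of what such a data dictionary could look like in software: every field a grantee reports is named and typed once, so any agency’s system can validate the same report. The field names and values below are invented for illustration, not drawn from an actual federal taxonomy.

        # A hypothetical slice of a government-wide grant-reporting data dictionary.
        GRANT_TAXONOMY = {
            "award_id": str,
            "recipient_name": str,
            "period_start": str,            # ISO 8601 date
            "period_end": str,              # ISO 8601 date
            "federal_share_expended": float,
        }

        def validate_report(report):
            """Return a list of problems found against the shared taxonomy."""
            problems = ["missing field: " + name for name in GRANT_TAXONOMY if name not in report]
            problems += ["wrong type: " + name for name, typ in GRANT_TAXONOMY.items()
                         if name in report and not isinstance(report[name], typ)]
            return problems

        report = {"award_id": "ED-2018-0001", "recipient_name": "Example University",
                  "period_start": "2017-10-01", "period_end": "2018-09-30",
                  "federal_share_expended": 48250.00}
        print(validate_report(report))   # an empty list means the report conforms

    Because every grantor agency would reference the same dictionary, a grantee could prepare one machine-readable report and have it accepted, and understood, everywhere.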

    FUTURE: If the federal government adopts a common data taxonomy for all grant reporting, grantees will enjoy a reduced compliance burden, and agencies and the public will get better transparency.

    Last month, Congress took note. Reps. Virginia Foxx (R-NC) and Jimmy Gomez (D-CA) introduced the GREAT Act (H.R. 4887), which would require the government to create the necessary taxonomy and then require all agencies to use electronic data, formatted consistently with that taxonomy, to collect information from their grantees. The House Oversight Committee unanimously passed the GREAT Act on February 6th, sending it to the full House of Representatives.

    Now, thanks to this week’s announcement, it’s clear that the White House is keen to take on the challenge of standardizing grant data, even in advance of a mandate from Congress.

    Here’s what the White House intends to do.

    First, working with the Department of Education and the Department of Health and Human Services, the White House will standardize the “Core Data Elements” that are used in reports that grantees submit to “a significant number of agencies.” This should be complete by the end of Fiscal Year 2018, or September 30, 2018. Details are on page 5 of the White House’s grants management Action Plan.

    Second, the White House will figure out how to govern and maintain the new taxonomy. The White House intends to complete this step by the same deadline: September 30, 2018.

    Third comes the hard part. The White House will “[D]evelop and execute [a] long-term plan for implementing data standards government-wide.” That means forcing all the grantor agencies to collect reports from their grantees in electronic data, formatted consistently with the taxonomy. The Action Plan announces no deadline for this crucial third step.

    Alongside these steps, the White House intends to create a common solution for Single Audit Reporting and build a tool to help agencies manage grant risk (page 6 of the Action Plan).

    Finally, once grant reports have been transformed into standardized data, and once new tools have been built to utilize that data, the White House will lead all grantor agencies to manage their grant programs based on risk (page 7 of the Action Plan).

    We are excited that the White House has put itself on a pathway to transforming all federal grant reporting.

    We won’t let our leaders off the hook, of course; we’ll still work to convince Congress to pass the GREAT Act right away, so that the transformation won’t just be a White House plan but a legal mandate.

    We know the road will be long. If the federal grant system were one company, it would be, by far, the world’s largest, with over $600 billion in annual revenue.

    But for the same reason, automating grantees’ compliance burden and bringing system-wide transparency for agencies and the public is too good an opportunity to miss.

    Note: Read our full summary of the President’s Management Agenda here.


