In the final weeks of 2019, Congress passed a significant piece of legislation, and the president signed the bipartisan bill into law Dec. 30, 2019: the Grant Reporting Efficiency and Agreements Transparency Act of 2019, or the GREAT Act, which will modernize federal grant reporting.
If your organization is one of the recipients or grantees of the more than $700 billion in annual financial assistance from the federal government, the GREAT Act will affect you and how your agency, institution, or research lab reports information back to the federal government.
We’ve been following the GREAT Act closely and are here to help answer your questions surrounding the legislation. Check out the Q&A below for what you need to know now, and for what’s next now that the GREAT Act is law.
Federal grants touch nearly every American through grant programs that provide free or reduced school lunches, equip our police and fire departments, build and maintain our highways and transportation infrastructure, support small businesses, and much, much more.
What is the GREAT Act intended to do?
“We named it the GREAT Act for a reason,” Rep. Virginia Foxx (R-N.C.) said when she introduced the bill in 2017. “The results of the passage will be great for stakeholders, government agencies, job creators, grantees, and grantors.”
It is designed to standardize a data structure for the information that recipients must report to federal agencies and to streamline the federal grant reporting process.
“Without updating the way we process grant reports, in many ways we might as well be still operating with a typewriter and a fax machine and Wite-Out,” Foxx said.
Based on my experience in implementing the DATA Act—the GREAT Act’s predecessor that requires financial data to be standardized, machine-readable, and transparent—here are three initial steps you can take to be proactive:
When taking these first steps, be sure to take an incremental approach. Let’s face it: change is hard. Do the easiest and smallest things first to make meaningful progress. Also, truly look at this as an opportunity to modernize and leverage the GREAT Act as a catalyst to push forward data and efficiency initiatives that may have stalled out in the past.
You probably won’t experience a great deal of change right away, as the act requires the White House Office of Management and Budget (OMB) and the designated “standard-setting agency”—likely the Department of Health and Human Services (HHS)—to develop data standards over the course of the next two years. Once the standards are set, federal grant-making agencies have one year to adopt the standards and issue guidance to recipients for all future information collection requests.
As mentioned above, you can take this time to implement new best practices that improve processes across the board, so you can not only be fully prepared for when the effective date arrives, but also take advantage of the new efficiencies in advance.
The law requires the new data standards to be nonproprietary and machine-readable, and to align with existing machine-readable data standards, including those set under the Federal Funding Accountability and Transparency Act of 2006 and used for USAspending.gov reporting.
If you are a grant recipient, what you report will likely not change substantially, but how you report will change in ways you might not have expected. More on those in the next section.
Some of the benefits to reporting organizations include a reduction in redundant reporting, potential for automated validation of a report’s completeness, automated delivery, and if you’re a high-performing organization, faster recognition of your excellent performance.
First benefit: Grant recipients may need to report less data. Recipients will be required to submit data in a machine-readable format, meaning that just sending PDFs will no longer be acceptable. Federal agencies will leverage technology to automate and streamline reporting processes, so recipients would report data only once, and federal agencies would use it multiple times.
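To make the “report once, use many times” idea concrete, here is a minimal sketch of what a machine-readable grant report could look like, paired with the kind of automated completeness check that structured data enables. The field names are invented for illustration; the actual data elements will be defined by OMB and the designated standard-setting agency, as described above.

```python
import json

# Hypothetical example: these field names are invented for illustration.
# The real data elements will come from the standards developed under
# the GREAT Act, not from this sketch.
grant_report = {
    "award_id": "EX-2020-001234",
    "recipient_id": "ABC123DEF456",
    "reporting_period": {"start": "2020-01-01", "end": "2020-03-31"},
    "federal_share_expended": 125000.00,
}

# Because the report is structured data rather than a PDF, a receiving
# system can validate completeness automatically before accepting it.
REQUIRED_FIELDS = {"award_id", "recipient_id", "reporting_period",
                   "federal_share_expended"}
missing = REQUIRED_FIELDS - grant_report.keys()
print("complete" if not missing else f"missing fields: {sorted(missing)}")

# The same machine-readable payload can then be reused by every agency
# that needs it, so the recipient reports it only once.
print(json.dumps(grant_report, indent=2))
```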
Second benefit: Federal agencies can make more informed decisions when awarding grants. That’s a huge advantage if you are a recipient that consistently receives a clean audit opinion—chances are you will have greater access to federal grant dollars. The GREAT Act simplifies the current single audit reporting by requiring the gold mine of single audit data to be reported in an electronic format consistent with the data standards established under the act.
But if your single audit reports identify material weaknesses and repeat findings year after year, right now is a good time to get your house in order if you want to keep receiving federal funds.
Keep in mind this general timeline:
Our team at Workiva is committed to keeping you informed on significant news related to the GREAT Act, whether you are a recipient, auditor, or federal grant-making agency. The Data Coalition, America’s premier voice on data policy, is another great resource for staying up to date. Check out the Data Coalition website, and subscribe to their newsletter.
Editor’s note: This blog has been updated and was originally published on December 20, 2019, at workiva.com.
In 2019, tremendous bipartisan efforts led to new federal data laws, progress on legislation that may become law in 2020, and gains inside the Executive Branch in implementing strategies to make government data more accessible and usable. Against a backdrop of hyper-partisanship and political turmoil in Washington, DC, the value of this bipartisan approach to improving the federal government’s data policies cannot be overstated.
Here are a few highlights from the past 12 months on areas of success for Data Coalition priorities:
Other Data Coalition priorities are also advancing: work proceeds to establish a federal data service, Congress is considering the Taxpayer Right to Know Act to develop program inventories, the National Security Commission on AI is producing its final recommendations, the Advisory Committee on Data for Evidence Building has been established and begun to deliberate, and our country’s lawmakers are considering reforms through the Select Committee on the Modernization of Congress.
In short, the Data Coalition’s advocacy efforts paid off tremendously this year – and are continuing to accrue benefits – for the American people and our society. Through nearly 100 briefings on key policy priorities, the GovDATAx Summit, and countless other activities that brought together the corners of the data community, the Data Coalition’s members and expertise made real and lasting headway for the country.
In 2020 and beyond, as agencies work to implement the Evidence Act, the GREAT Act, and more, the Data Coalition’s members and staff will continue to serve as a resource, holding government accountable for effective implementation while devising strategies for continuous improvement. The moment for meaningfully transforming government data into a strategic asset is upon us, and we hope the continued enthusiasm and support for the Data Coalition’s efforts will sustain this momentum in the coming years. More importantly, have confidence that in 2020, conversations about better using government data as an asset will continue to bring together Republicans and Democrats as we work to achieve common goals for improving society and meeting the needs of the American people.
The United States government is rapidly progressing in implementing new laws and guidance on evidence-based policymaking, data-driven government, and open data. During the Data Coalition’s GovDATAx Summit in 2019, Department of Commerce Deputy Secretary Karen Dunn Kelley suggested that a metaphorical book on evidence-based policymaking for the U.S. is being written, with new chapters every year. In Chapter 1, in 2017, the U.S. Commission on Evidence-Based Policymaking wrote a seminal report on better using government data. In Chapter 2, in 2018, Congress passed the Foundations for Evidence-Based Policymaking Act (Evidence Act), and in Chapter 3, in 2019, the Executive Branch published its Federal Data Strategy.
What’s the next chapter?
Data Coalition members met with senior leadership at the Department of Commerce to discuss Chapter 4 leading into 2020 and beyond. In addition, groups like Project Evident, the Brookings Institution, and the University of Chicago’s Center for Impact Sciences are all developing plans for the next generation of evidence-based policymaking.
Key aspects and inputs for the next chapter will include:
Needless to say, there is much work to do to ensure our government increasingly adopts evidence-based and data-driven approaches.
What can those outside government do to support evidence-based policymaking?
As the next chapter of the evidence and data movement is written, stakeholders in non-profits, academia, and the private sector can all contribute. Action items could include:
As the federal government continues to improve data quality, access, and ease of use, the Data Coalition will continue to support its members in engaging in each of the action items as the next chapter is written, and beyond.
One of the largest data collection exercises undertaken by the United States government every 10 years is about to begin – the decennial census. In June 2019 the Data Coalition announced its formal partnership with the U.S. Census Bureau to help promote an accurate count, with support from the business community.
Data Coalition members met with U.S. Census Bureau Deputy Director Ron Jarmin to discuss the inflection points as the country prepares for the 2020 census, in addition to how the business community’s technology and data analytics firms can be more involved. Here are three key takeaways from the discussion:
#1: The benefits of a high-quality, accurate count are both clear and significant.
The country needs a high-quality census count to support a range of activities beyond allocating representation in Congress, from evidence-based policymaking in government to data-driven decision-making in the private sector. Decision-makers throughout government rely on the census to allocate grant funds, to determine the number of households in a region when assessing program needs, and to benchmark other surveys, such as the American Community Survey. Similarly, the business community relies on census data to determine where to locate new offices or retail establishments and to inform market insights.
An accurate count is also a difficult feat because some populations are especially hard to count reliably, including migrant communities, children under five, and homeless individuals, to name a few. All of these populations are important to capture because, as population dynamics shift, government and businesses must be able to respond accordingly to ensure public and private services are efficient, effective, and meet the expectations of the American people.
#2: Census privacy and confidentiality protections are strong.
While there has been much discourse over the past year about how certain data may be used to support decision-making, Census Bureau data must be held in confidence.
The Census Bureau is only allowed to release summary statistics or information that does not include personal identifiers to the public. Any violation of this policy is also a violation of two separate laws – the Census Act and the Confidential Information Protection and Statistical Efficiency Act of 2018 – potentially carrying $250,000 in fines and up to 5 years imprisonment under each of the laws for each offense. Needless to say, the safeguards in law and throughout the Census Bureau’s professional staff are taken seriously and the American public should be assured that their confidential data will be strictly protected.
#3: Every organization can – and should – play a role in supporting an accurate count.
There are numerous tactics that businesses and non-profit organizations can use to support an accurate census count. Whether large or small, all organizations can play a role. Activities range from promoting the census on websites and encouraging employees to respond in 2020 with emails or even breakroom posters, all the way to targeted support services that meet particular needs.
Data Coalition member Esri published a resource in July 2019 explaining relevant methodological and technology tools for supporting the geospatial capabilities needed for the census. Another Data Coalition member, Tableau, is supporting the Census Bureau’s efforts to track response rates once the census begins, so that local community organizers can have efficient metrics to support their efforts. Deloitte Consulting offers a variety of IT and management support roles to support efficient execution of the 2020 census. New member USAFacts is working to promote the new features of next year’s census. The Census Bureau continues to search for partners in the business community for the 2020 census.
Other data and technology partners are critical for supporting the census as social media and internet use have rapidly advanced over the past decade. The 2020 census will allow responses through the internet, making responding easier than ever, but the risk of misinformation campaigns and the presence of cybersecurity threats are real. Technology and data companies can help the census reduce these risks in the modern world.
The Data Coalition will continue to support the effort to ensure the 2020 census results in useful data for the American people.
Over the past two years, the prospect of the United States government and key decision-makers becoming more steeped in evidence-based policymaking has become increasingly bright.
On September 7, 2017, the U.S. Commission on Evidence-Based Policymaking (Evidence Commission) released its final report to the President and Congress with a strategy for better using the data that government already collects. The report contained 22 unanimous recommendations that focused on responsibly improving access to government data, strengthening privacy protections, and expanding the capacity to generate and use evidence.
Progress on Fulfilling the Commission’s Vision
While action has not yet been taken on all of the Evidence Commission’s recommendations, significant progress has occurred. Here are some key highlights from the past two years:
Next Steps on Fulfilling the Commission’s Vision
The Evidence Commission set the stage for monumental changes in government data laws, processes, and culture. Agencies have initiated wholesale overhauls of data management practices and the recommendations are quickly becoming reality.
But much work remains to fulfill the bipartisan vision outlined by the Evidence Commission – that government data are meaningfully analyzed to produce credible evidence that is actually used to inform policymaking. In the coming months and years, here are five areas for further attention:
Today, the Evidence Commission’s legacy can be celebrated as a substantial accomplishment in developing a sound and actionable strategy for better using government data. While more attention is needed to change government’s culture and operations to be more evidence-based, the early steps to better manage and use data are exceedingly promising.
Guest blog by Jane Wiseman, Institute for Excellence in Government.
As a fellow at the Harvard Kennedy School helping to support a national network of state chief data officers, I’ve had a front row seat to leading edge analytics for the past three years. As a more recent observer of chief data officers in the federal government, I’ve been impressed by the excellence and diversity of innovative data work being done by the pioneering data officers in federal service.
While reading the Federal Data Strategy Action Plan, I was inspired by how detailed and thoughtful it is. Actions address a range of important topics, and it’s wonderful to see data governance and data literacy getting attention, as these topics are not glamorous and sometimes get too little focus at the state and local level. There’s an aggressive timeline in the action plan and a remarkable amount has already been accomplished.
I was honored to share my thoughts at the Federal Data Forum hosted by the Data Coalition and the White House in early July, and was energized by the range of experts who shared their ideas. The simple fact that OMB is creating a data strategy for the federal government is one of the most exciting developments in data-driven government in my career and offers a tremendous opportunity to deliver more value for taxpayers.
Listening to other speakers during the forum, I was surprised twice – first by the common concern about the gap between the data literacy government managers need and their current skill levels, and second by how few voices were calling for agencies to create and publish their own data strategies.
Observation #1: The federal government needs to invest in data literacy.
Most leaders and decision-makers in government are not digital natives, and many lack confidence in their data and digital skills. This gap will slow the adoption of data-driven government.
Executives need to know how to ask for data analysis (what’s possible and what’s not) and how to critique results. They don’t need to be able to code, build an algorithm, or make a map themselves, but they should know the power and capability of modern data tools. Basic data literacy means knowing how to ask good questions that inspire analysis, and then having the confidence to interpret and use the results. Consider a comparative study of adult workforce skills in "literacy, numeracy, and problem-solving" across 23 participating countries: Japan and Finland led the pack on numeracy, while the United States ranked a disappointing 21st out of 23. Closing this gap and moving toward large-scale organizational change to data-driven government will take a variety of training and coaching offerings. And it will take time.
Recommendation: The federal government should provide a full suite of data literacy training, support and coaching for senior leaders and management executives in government so that they have the confidence to lead data efforts.
Observation #2: Each agency should have its own data strategy and plan.
As the saying goes, “If we don’t know where we’re going we might end up somewhere else.”
In the Federal Data Strategy Action Plan, actions 12-16 call for parts of what would be agency-specific data strategies. But this call to action falls short of asking each agency head to publish a multi-year data strategy and to report annually on progress toward achieving that strategy. We need to know where each agency is going in order to achieve data-driven decision making at every level of the organization and across all bureaus and agencies. Without a long-term strategy, we can’t hope for permanent culture change.
The elements that exist in the action items — from data governance to open data to important data questions — need to be knit together into a multi-year plan for optimizing the use of data. Targets also must be set for each element and reported on an annual basis in a cohesive, integrated, department-wide plan.
With a clear charter from the chief executive, and armed with a strategy, a chief data officer can define a roadmap describing the difference the chief data officer team can make in government over a three- to five-year time horizon. The best chief data officers at the federal, state, and local levels operate from a strategy, a guiding document that sets mission and vision. Sharing their strategies helps them stay focused and helps communicate to others what is and is not expected (e.g., the chief data officer is not the person to call to fix the printer or an email server issue).
Agency heads must be the individuals ultimately responsible for their data strategy and must invest their chief data officers with the authority and resources to carry it out. It needs to be clear to all that the chief executive relies on data and views the chief data officer as a trusted and important resource to the organization.
Strategy is about mission and outlining key questions. Asking good questions matters and clearly defining how the analytics research question will generate public value is important. The importance of understanding the connection to priority policy questions is summed up well by one of the CDOs I interviewed last year who said, “You might be Lord Algorithm but if you don’t stop to understand the problem, you will never succeed in making government better.”
When the UK created the Government Digital Service in 2011, its focus was on long-term transformation of government to digital services, and it has made consistent progress every year. But it didn’t all happen overnight. One of the keys was making sure every agency had a point person and that he or she had a roadmap. We need that same level of focus on individual federal agency strategies.
Recommendation: OMB should hold agency heads accountable on an annual basis for progress toward data-driven government, require them to lead their department’s multi-year strategic plan for data, and require them to publish that strategy.
Successful Implementation Holds Promise for Lasting Impact
With this exciting momentum at the federal level toward becoming more data-driven in decision-making, there is a tremendous opportunity for the federal government to support the development of capacity in state and local government as well. For example:
The federal government has a unique opportunity at this moment to help incubate and advance the field with actions in every agency. Leadership and investment in capacity now will pay dividends long into the future.
The Federal Data Strategy, with an iterative and collaborative approach, should support agencies moving forward. If it’s done right, the strategy will lead to a major transformation of government.
Whether government should use data to improve society is no longer up for debate – the answer is definitively yes. When high-quality government information is accessible, it can be applied to generate insights that help decision-makers develop better policies for the American people. We’ve seen successes in early education, disease prevention, and government transparency efforts, and more could be done with better access to data.
For too long, our government operated in silos to address some of the issues related to data access and use, without planning for a wide range of data users. Too frequently, the data community reinforces its own silos rather than working in collaboration. That is why the Data Coalition is launching GovDATAx Summit in 2019.
Our inaugural GovDATAx Summit will be the beginning of a renewed dialogue about how we improve government’s data policies to truly unleash the power of data for the public good. The data community should work to empower innovators to generate insights that improve our economy and the quality of life for the American public. The conversation at GovDATAx is intended for an audience that:
During the Summit’s main sessions, experts from the White House, agency leadership, academia, and the private sector will discuss important new bipartisan policies that are being implemented this year, like the Evidence Act, which establishes new data leaders – chief data officers, evaluation officers, and statistical experts – across the federal government. GovDATAx will feature a discussion of important data standards that lead to better data quality, promote opportunities for public-private partnerships, and present exemplars about what works for using data as an asset.
Be a part of shaping the future of government data. Join the Data Coalition and hundreds of leaders from across the public sector, businesses, non-profits, and academia on Wednesday, October 30 to discuss the next steps for developing policies that unleash data for good.
The American public provides an incredible amount of information to the federal government – valued at $140 billion each year. This information is provided as part of businesses complying with regulations, individuals and firms paying taxes, and individuals applying for programs.
The development of the Executive Branch’s Federal Data Strategy is an effort to better organize and use all of that information to improve decision-making. On July 8, 2019, the Data Coalition joined the White House to co-sponsor a public forum gathering feedback on what actions the federal government will undertake over the next year to begin implementing the data strategy.
Kicking off the event, Dr. Kelvin Droegemeier, Director of the White House’s Office of Science and Technology Policy, stressed a key goal of the strategy is to “liberate” government data for society’s use, noting that nearly 90 percent of government data go unused. During the forum, 52 speakers – including 12 members of the Data Coalition – and more than 100 other experts provided input on how to improve the draft plan.
Here are four takeaways from the public comments provided during the forum:
#1. Leadership is Essential for Realizing Culture Change
New legislation enacted in early 2019 creates several new positions in government agencies, such as chief data officers, evaluation officers, and statistical experts. Throughout the public forum, speakers stressed the need for these leaders to be empowered to institute changes within federal agencies, and to be informed about how to most effectively implement best practices for data access, management, and use.
Several speakers specifically stressed the critical role for newly established chief data officers in improving data quality and usefulness across government, in addition to providing improved training and tools for agencies to equip the federal workforce to use data. The concept of data literacy was also prominently featured, meaning that throughout federal agencies the workforce should be trained routinely about responsibilities and techniques for responsibly managing and using data.
#2. Targeted Data Standards Offer Opportunities for Efficiency
The need for improved data standards was discussed by speakers on more than half the panels during the event, with suggestions that the Federal Data Strategy could do more to encourage standards in the areas of financial regulatory reporting, agency spending and grant management, geospatial data, and organizational entities. For example, multiple speakers highlighted the opportunity to include more direction on adopting common business entity identifiers, like the globally recognized Legal Entity Identifier (LEI), as a means of improving analytical capabilities while also reducing reporting burdens on regulated entities.
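To illustrate why a single, open identifier lends itself to automation, here is a minimal sketch that checks an LEI’s check digits. Under ISO 17442, an LEI is 20 alphanumeric characters whose final two digits are validated with the ISO 7064 MOD 97-10 scheme (the same family used for IBANs). The identifier in the usage line is made up, not a real LEI.

```python
import re

def is_valid_lei(lei: str) -> bool:
    """Validate an ISO 17442 Legal Entity Identifier (LEI).

    An LEI is 20 alphanumeric characters; the final two are check digits
    verified with the ISO 7064 MOD 97-10 scheme: map letters to 10-35,
    read the result as one large integer, and require remainder 1 mod 97.
    """
    lei = lei.strip().upper()
    if not re.fullmatch(r"[0-9A-Z]{20}", lei):
        return False
    # int(c, 36) maps '0'-'9' to 0-9 and 'A'-'Z' to 10-35.
    as_digits = "".join(str(int(c, 36)) for c in lei)
    return int(as_digits) % 97 == 1

# Usage with a made-up identifier; real LEIs can be looked up via GLEIF.
print(is_valid_lei("ABCDEF1234567890XX00"))
```

Because the check is purely mechanical, any regulator or firm can validate an incoming identifier without consulting a proprietary database, which is part of what makes a common, open identifier attractive.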
#3. Partnerships are an Important Element for Success
Many speakers noted their appreciation for the opportunity to provide feedback on the data strategy, and encouraged ongoing collaboration with those outside government through public-private partnerships. As the strategy is implemented over the next year, industry, non-profits, academics, and others in the American public should have opportunities to weigh in and hold agencies accountable for achieving the stated goals in the plan. Partnerships also offer agencies a specific means to coordinate with those outside of government to ensure the implemented policies and practices achieve meaningful improvements in the country’s data infrastructure.
#4. Coordination at OMB and Agencies is Key
Finally, because government data collected by one agency are often relevant to another, coordination is a critical component of success for the Federal Data Strategy. Speakers highlighted that the proposed OMB Data Council could serve as a model for agencies of how to work across interests, laws, and policy domains to achieve lasting change. But coordination is not OMB’s responsibility alone: every agency must coordinate across its own programs and leaders to promote culture change and a data-literate workforce, and allocate resources to achieve the goals of the strategy.
In the coming months, the action plan will be finalized and publicly released, incorporating the comments from the Coalition-White House public forum along with other written feedback. The Data Coalition looks forward to continuing to partner with the federal government to ensure our national data policies truly make data an asset for the country.
To read the Data Coalition’s comments on the Strategy’s Draft Action Plan, click here.
The following comments were adapted from the RegTech Data Summit Keynote Address of Ben Harris, Chief Economist, Results for America – Former Chief Economist and Economic Advisor to Vice President Joe Biden. Delivered April 23, 2019 in New York City.
In a time of seemingly insurmountable partisanship, Congress was able to come together around the issue of evidence-based policy and pass the Foundations for Evidence-Based Policymaking Act (Evidence Act) that makes some dramatic changes to our ability to learn from data and evidence in our quest for better policy. As many of you may know, the OPEN Government Data Act—which was included in the Evidence Act—implements some remarkable changes, including:
To start, it’s probably worthwhile to acknowledge that issues like data standardization and calls for access to open data might not be the sexiest topics, but we all can appreciate their importance.
As a newcomer to this topic, I have only begun to understand and appreciate the need for widespread access to standardized, machine-readable data.
The need for standardized and accessible data is crucial, but so is the need to inject more evidence-based policy into our system of legislation and regulation.
I have come to believe that our whole economy, not just government regulators, faces a massive information deficit. Our economy, which now runs on data and information more than ever, still has gaping holes in the availability of information, holes that undermine markets and can lead to wildly inefficient outcomes.
When it comes to evidence-based policymaking, our government has a long way to go. As pointed out in a 2013 op-ed in The Atlantic by Peter Orszag and John Bridgeland, only about one percent of our federal dollars are allocated based on evidence and evaluations. From my perspective, this is about ninety-nine percentage points too few.
The lack of evaluation can inject massive and longstanding inefficiencies into our federal, state, and city-level budgets, resulting in wasteful spending and missed opportunities to improve lives. This is never more evident than in our country’s $1.5 trillion tax expenditure budget. We have hardly stopped to ask whether the $1.5 trillion spent annually on targeted tax breaks is achieving its desired objectives.
The benefits of better evidence and data extend well beyond direct spending and tax administration. Better data can mitigate the economic pain caused by a recession. Indeed, the severity of the financial crisis was exacerbated by the information deficit in the wake of Lehman’s collapse and the inevitable chaos that followed. Had financial firms and regulators been able to more accurately and quickly assess the extent of the damage through standardized financial data, we would have seen less radical actions by investors to withdraw from credit risk and more effective government intervention. Of all the factors that played a role in the crisis, I don’t think it’s hyperbole to say that the lack of data standardization is perhaps the least appreciated.
Evidence-based policy is also not just a matter of better government; it’s about people’s faith in government in the first place. Results for America recently commissioned a nationally representative survey of Americans’ attitudes toward the role of evidence in policymaking. When asked what most drives policymakers’ decisions, a whopping forty-two percent said “boosting popularity, or getting votes,” thirty-four percent said the influence of lobbyists, and just eight percent said evidence about what works. Surely these responses are cause for concern.
Fortunately, there are solutions.
To start, in a time when there are seemingly no bipartisan bills, we saw the passage of the Evidence Act—which is known to some as the umbrella bill for the OPEN Government Data Act. As I noted at the beginning, the Evidence Act represents a major step forward not just for the capacity of government agencies to implement evidence-based policy, but for the public to gain access to open, machine-readable data.
Of course, this law is the beginning, not the end. We can help solve private market inefficiencies by calling for more data.
These reforms can and should be viewed as steps to aid the private sector, hopefully leading to better economic outcomes, lessened regulatory burdens, or both.
On the whole, I am clear-eyed about the challenges faced by advocates for evidence-based policy. But with the passage of the Evidence Act, it is clear that progress can be made. To me, it feels like we are on the cusp of a new movement to incorporate data and evidence in all that government does. Together we can help ensure that policy does a better job of incorporating data and evidence, leading to improved lives for all Americans.
In recent years, we have seen an explosion of regulatory technology, or “RegTech.” These solutions have the potential to transform the regulatory reporting process for the financial industry and the U.S. federal government. But RegTech can only thrive if government financial regulatory agencies, like the Securities and Exchange Commission (SEC), the Commodity Futures Trading Commission (CFTC), and the Federal Deposit Insurance Corporation (FDIC), adopt structured, open data standards for the forms they collect from the private sector. We have seen changes, and momentum for RegTech adoption is picking up, but there is much more to be done.
At this year’s RegTech Data Summit on Tuesday, April 23, in New York, we’ll explore the intersection of regulatory reporting, emerging technology, and open data standards with financial regulators, industry leaders, RegTech experts, academics, and open data advocates.
The Data Coalition has long advocated for RegTech policy reforms that make government regulatory reporting more efficient and less burdensome on both agencies and regulated entities. The benefits are clear. Unified data frameworks support efficient analytical systems; common, open data standards clear the path for more accurate market risk assessments among regulators and bolster transparency.
The Summit comes at an opportune time. Federal financial regulators have already begun replacing document-based filings with open data standards.
Within the past year, the SEC voted to mandate inline eXtensible Business Reporting Language (iXBRL) for corporate financial filings. The Federal Energy Regulatory Commission (FERC) proposed a rule change that would require a transition from XML to XBRL. The House of Representatives held the first-ever hearing on Standard Business Reporting (SBR). The Financial Stability Oversight Council (FSOC) reiterated its recommendation for the adoption of the Legal Entity Identifier (LEI) to improve data quality, oversight, and reporting efficiencies.
There are also a number of international examples of government-wide adoption of open data that can serve as a guide for similar efforts in the U.S., which we will explore at our Summit. Standard Business Reporting (SBR), as successfully implemented by Australia, is still the gold standard for regulatory modernization efforts. By using a standardized data structure to build SBR compliance solutions, the Australian government streamlined its reporting processes, saving the government and the private sector AUD $1 billion in 2015-2016.
The success of SBR in Australia is undeniable. During our Summit, panelists will discuss how Congress is considering policy reforms that would enable the adoption of RegTech solutions and could achieve similar savings here. The Coalition has supported the Financial Transparency Act (FTA) since its introduction (H.R. 1530, 115th Congress). The FTA directs the eight major U.S. financial regulatory agencies to publish the information they collect from financial entities as open data: electronically searchable, downloadable in bulk, and free of license restrictions.
Once financial regulatory reporting is expressed as standardized, open data instead of disconnected documents, RegTech applications can republish, analyze, and automate reporting processes, providing deeper insight and cutting costs.
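As a small, self-contained illustration of what machine-readable filings enable, the sketch below extracts a tagged fact from a minimal inline XBRL (iXBRL) fragment. The ix namespace and the nonFraction element come from the iXBRL specification; the document fragment, context name, and figure are invented for illustration.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical iXBRL fragment: the human-readable number in the
# HTML doubles as a machine-readable fact tagged with a taxonomy concept.
IXBRL_SAMPLE = """<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:ix="http://www.xbrl.org/2013/inlineXBRL">
  <body>
    <p>Revenues (USD millions):
      <ix:nonFraction name="us-gaap:Revenues" contextRef="FY2018"
                      unitRef="usd" decimals="-6">7,500</ix:nonFraction>
    </p>
  </body>
</html>"""

IX_NS = "{http://www.xbrl.org/2013/inlineXBRL}"

# Instead of scraping a PDF, a RegTech tool can read every tagged fact
# directly, with its concept name and reporting context attached.
root = ET.fromstring(IXBRL_SAMPLE)
for fact in root.iter(IX_NS + "nonFraction"):
    print(fact.get("name"), fact.get("contextRef"), fact.text)
```

The same document serves both audiences at once: a person reads the filing as ordinary HTML, while software consumes the embedded tags, which is why a move from disconnected documents to standards like iXBRL cuts so much manual re-keying out of the compliance pipeline.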
Entity identification systems are another pain point for the U.S. regulatory community. A recent Data Foundation report, jointly published with the Global Legal Entity Identifier Foundation (GLEIF), discovered that the U.S. federal government uses at least fifty distinct entity identification systems – all of which are separate and incompatible.
If widely and properly implemented in the United States, a comprehensive entity identification system based on the LEI could help identify and mitigate risk in financial markets, track and debar low-performing federal contractors, improve supply chain efficiency, and generally be useful anywhere a government-to-business relationship exists. By working together, industry and government leaders can reap the benefits of these emerging RegTech solutions and open data applications.
Karla McKenna, who is Head of Standards at GLEIF and specializes in international financial standards, and Matt Reed, Chief Counsel at the U.S. Treasury’s Office of Financial Research, are among the leading voices we will hear from at the Summit. Together with Ken Lamar, former Special Vice President at the Federal Reserve Bank of New York, and Robin Doyle, Managing Director at the Office of Regulatory Affairs at J.P. Morgan Chase, they will analyze the status of open standards and the impact of a single entity identifier.
We’ll be delving into RegTech applications like blockchain, analytic applications, and AI systems, as well as policies that will transform regulatory reporting like the FTA, and more at the second annual RegTech Data Summit on April 23. The Summit will convene financial regulators, industry leaders, academics, and open data advocates to discuss the latest innovations in regulatory technology and what the future holds.
Summit-goers will have the opportunity to hear from SEC, Treasury, FDIC, and J.P. Morgan Chase representatives just to name a few. Featured speakers include former SEC Commissioner Troy Paredes; Dessa Glasser, Principal, The Financial Risk Group and formerly CDO, J.P. Morgan Asset Management; and Mark Montoya, Senior Business Analyst, FDIC.
The Summit will focus on three main themes as we explore the future of U.S. regulatory reporting technology:
It is clear that RegTech solutions will disrupt compliance norms by increasing efficiency, enhancing transparency, and driving analytics. However, successful implementation of this technology is only possible when government and industry focus on collecting, reporting, and publishing high-quality, structured data. If you are eager to explore a future of compliance in which document-based regulatory reporting is a thing of the past, then join us at the second annual RegTech Data Summit, The Intersection of Regulation, Data, and Technology.
For more information on the Summit, check out our event webpage.