
Blog

  • January 10, 2020 9:00 AM | Data Coalition Team (Administrator)

    In the final weeks of 2019, Congress passed a significant piece of legislation, and the president signed the bipartisan bill into law Dec. 30, 2019: the Grant Reporting Efficiency and Agreements Transparency Act of 2019, or the GREAT Act, which will modernize federal grant reporting.

    If your organization receives a share of the more than $700 billion in annual financial assistance from the federal government, the GREAT Act will affect how your agency, institution, or research lab reports information back to the federal government.

    We’ve been following the GREAT Act closely and are here to help answer your questions surrounding the legislation. Check out the Q&A below for what you need to know now, and for what’s next, now that the president has signed the bill into law.

    What is the GREAT Act?

    Federal grants touch nearly every American through grant programs that provide free or reduced school lunches, equip our police and fire departments, build and maintain our highways and transportation infrastructure, support small businesses, and much, much more.

    The GREAT Act is intended to:

    • Modernize grant reporting by developing government-wide data standards for information reported by grant recipients.
    • Reduce federal grant recipients’ reporting burden and compliance costs by increasing automation and applying new technologies.
    • Increase transparency and strengthen federal agencies’ oversight and management of federal grants by requiring the publication of recipient-reported data collected from all agencies on a single public website.

    “We named it the GREAT Act for a reason,” Rep. Virginia Foxx (R-N.C.) said when she introduced the bill in 2017. “The results of the passage will be great for stakeholders, government agencies, job creators, grantees, and grantors.”

    It is designed to standardize a data structure for the information that recipients must report to federal agencies and to streamline the federal grant reporting process.

    “Without updating the way we process grant reports, in many ways we might as well be still operating with a typewriter and a fax machine and Wite-Out,” Foxx said.

    What do I need to do to prepare for the GREAT Act?

    Based on my experience in implementing the DATA Act—the GREAT Act’s predecessor that requires financial data to be standardized, machine-readable, and transparent—here are three initial steps you can take to be proactive:

    1. Establish a core team responsible for implementation: Identifying the people at your organization who will closely track and be held accountable for implementing the act is your first step. While the key stakeholders in your organization will likely change over time, you need to identify those who will be most operationally affected by the act—this includes everyone from grant administrators to systems/IT owners to program staff.
    2. Engage in developing the data standards: The GREAT Act requires the standard setters to consult with key stakeholders, including states, local governments, and federal agencies. It is important that you and your team stay up to date with the latest proposals and provide feedback to head off burdensome or less-than-ideal standards. Put simply, now is your chance to shape the future of federal grant reporting.
    3. Conduct an audit of your data and processes: What you are required to report and/or collect is unlikely to change substantially (more on that below). Right now is a good time to identify process inefficiencies, data quality issues, and duplicative efforts. You can start by conducting a simple inventory of the data you submit and/or collect for federal award reporting.

    When taking these first steps, be sure to take an incremental approach. Let’s face it: change is hard. Do the easiest and smallest things first to make meaningful progress. Also, truly look at this as an opportunity to modernize and leverage the GREAT Act as a catalyst to push forward data and efficiency initiatives that may have stalled out in the past.

    How will the GREAT Act impact my current reporting process?

    You probably won’t experience a great deal of change right away, as the act requires the White House Office of Management and Budget (OMB) and the designated “standard-setting agency”—likely the Department of Health and Human Services (HHS)—to develop data standards over the course of the next two years. Once the standards are set, federal grant-making agencies have one year to adopt the standards and issue guidance to recipients for all future information collection requests.

    As mentioned above, you can take this time to implement new best practices that improve processes across the board, so you can not only be fully prepared for when the effective date arrives, but also take advantage of the new efficiencies in advance.

    The law requires that the new data standards be nonproprietary, machine-readable, and aligned with existing data standards, including those set under the Federal Funding Accountability and Transparency Act of 2006 and used for USAspending.gov reporting.
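    To make this concrete, here is a minimal sketch of what consuming nonproprietary, machine-readable spending data can look like in practice today. It pulls agency records from USAspending.gov’s public API; the endpoint and field names are assumptions drawn from the site’s v2 API documentation and may change, so treat this as illustrative rather than definitive.

    # A minimal sketch: fetching machine-readable federal spending data.
    # Assumption: the endpoint and field names below match USAspending.gov's
    # public v2 API (https://api.usaspending.gov); verify against current docs.
    import requests

    resp = requests.get(
        "https://api.usaspending.gov/api/v2/references/toptier_agencies/",
        timeout=30,
    )
    resp.raise_for_status()

    for agency in resp.json().get("results", [])[:5]:
        # Each record is plain JSON: nonproprietary and machine-readable,
        # so software can validate and analyze it without manual re-keying.
        print(agency.get("agency_name"), agency.get("budget_authority_amount"))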

    If you are a grant recipient, what you report will likely not change substantially, but how you report will change in ways you might not expect. More on that in the next section.

    What are the benefits of the GREAT Act?

    Some of the benefits to reporting organizations include a reduction in redundant reporting, the potential for automated validation of a report’s completeness, automated delivery, and, if you’re a high-performing organization, faster recognition of your excellent performance.

    First benefit: Grant recipients may need to report less data. Recipients will be required to submit data in a machine-readable format, meaning that just sending PDFs will no longer be acceptable. Federal agencies will leverage technology to automate and streamline reporting processes, so recipients would report data only once, and federal agencies would use it multiple times.
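    To picture the shift, consider this purely illustrative sketch. The GREAT Act’s data standards have not yet been written, so every field name below is hypothetical; the point is the shape of machine-readable reporting, where structured records are validated by software instead of re-keyed from PDFs.

    # Purely illustrative: the GREAT Act's standards are not yet written,
    # so these field names are hypothetical placeholders.
    import json

    REQUIRED_FIELDS = {"award_id", "recipient_name", "reporting_period", "federal_share"}

    def validate_grant_report(record):
        """Return a list of problems; an empty list means the record passes."""
        problems = ["missing field: " + f for f in sorted(REQUIRED_FIELDS - record.keys())]
        if not isinstance(record.get("federal_share"), (int, float)):
            problems.append("federal_share must be a number")
        return problems

    report = {
        "award_id": "HHS-2020-EX-0001",        # hypothetical identifier
        "recipient_name": "Example University",
        "reporting_period": "2020-Q1",
        "federal_share": 125000.00,
    }

    print(validate_grant_report(report))  # [] -> report once, reuse many times
    print(json.dumps(report))             # the machine-readable payload itself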

    Second benefit: Federal agencies can make more informed decisions when awarding grants. That’s a huge advantage if you are a recipient that consistently receives a clean audit opinion—chances are you will have greater access to federal grant dollars. The GREAT Act simplifies the current single audit reporting by requiring the gold mine of single audit data to be reported in an electronic format consistent with the data standards established under the act.

    But if your single audit reports identify material weaknesses and repeat findings year after year, right now is a good time to get your house in order if you want to keep receiving federal funds.

    What are the important dates I need to know for the GREAT Act?

    Keep in mind this general timeline:

    • Within two years: The federal government must set data standards.
    • Within three years: Once the standards are established, federal agencies will have one year to adhere to the standards, publish guidance to recipients, and explore modern technologies in reporting related to federal awards. In addition, OMB must issue guidance requiring single audit information to be reported in an electronic form consistent with the data standards.
    • Within five years: Once the standards are set and adopted, the federal government must publish the federal award information on a public website.

    Where can I go for more information?

    Our team at Workiva is committed to keeping you informed on the significant news related to the GREAT Act, whether you are a recipient, auditor, or federal grant-making agency. The Data Coalition, America’s premier voice on data policy, is another great resource for staying up-to-date. Check out the Data Coalition website, and subscribe to their newsletter.

    Editor’s note: This blog has been updated and was originally published on December 20, 2019, at workiva.com.


  • December 23, 2019 9:00 AM | Data Coalition Team (Administrator)

    In 2019, tremendous bipartisan efforts led to new federal data laws, progress on legislation that may become law in 2020, and gains inside the Executive Branch in implementing strategies to make government data more accessible and usable. Amid a backdrop of hyper-partisanship and political turmoil in Washington, DC, the value of this bipartisan approach to improving the federal government’s data policies cannot be overstated.

    Here are a few highlights from the past 12 months on areas of success for Data Coalition priorities:

    • New Law Enacted with Comprehensive Data Governance Framework. In January 2019, the President signed the Foundations for Evidence-Based Policymaking Act (Evidence Act), which includes the OPEN Government Data Act and the Confidential Information Protection and Statistical Efficiency Act. The Data Coalition lobbied for the successful passage of this legislation, which contains half of the recommendations made by the U.S. Commission on Evidence-Based Policymaking and overhauls the federal government’s data infrastructure to prioritize data governance and management. The law also establishes a new C-suite role at each federal agency, the chief data officer, and recognizes new leadership roles for evaluation and statistics. Congress even included additional funding in the final FY 2020 appropriations at the Data Coalition’s request to support implementation efforts.
    • Unanimous Congressional Approval of Grant Modernization. In December 2019, Congress unanimously passed the Grant Reporting Efficiency and Agreements Transparency (GREAT) Act; it is expected to be signed by the President soon. The bill amends existing law to transform federal grant reporting for the modern era by directing federal agencies to improve and streamline how information is collected and used.
    • Advancement of Processes for Innovative Data Applications in Government. While more than 50 bills related to artificial intelligence are under consideration in Congress, the Data Coalition’s priority, the AI in Government Act, advanced through committees in both chambers of Congress with bipartisan support. The bill would establish processes and guidance for responsibly and ethically applying AI across government.
    • Progress on Ensuring an Accurate 2020 Census. In 2019, Congress provided additional funding to support the Census Bureau’s herculean count of the American population in 2020, and with the endorsement of the Data Coalition, a bipartisan resolution rapidly progressed in the Senate to promote an effective census.
    • Renewed Dialogue and Enthusiasm for Entity Identification in the U.S. Through the Data Coalition’s advocacy efforts, a new conversation is underway in Congress about the strategy for adopting legal entity identifiers in the country’s financial services sector. The Financial Transparency Act was refiled earlier this year with a senior Democrat and a senior Republican co-sponsoring the legislation.
    • Development of the Federal Data Strategy and 1-Year Action Plan. The White House finalized its Federal Data Strategy, including activities in partnership with the Data Coalition, outlining an implementation strategy for the Evidence Act, support for AI research, and an enhanced governance framework that will continue to be a priority in coming years.


    Other Data Coalition priorities are also advancing as work proceeds to establish a federal data service, as Congress considers the Taxpayer Right to Know Act, which would develop program inventories, as the National Security Commission on AI produces its final recommendations, as the Advisory Committee on Data for Evidence Building is established and begins to deliberate, and as our country’s lawmakers consider reforms through the Select Committee on the Modernization of Congress.

    In short, the Data Coalition’s advocacy efforts paid off tremendously this year for the American people and our society, and the benefits continue to accrue. Through nearly 100 briefings on key policy priorities, the GovDATAx Summit, and countless other activities that brought together the corners of the data community, the Data Coalition’s members and expertise achieved real and lasting headway for the country.

    In 2020 and beyond, as agencies work to implement the Evidence Act, the GREAT Act, and more, the Data Coalition’s members and staff will continue to serve as a resource, holding government accountable for effective implementation while devising strategies for continuous improvement. The moment for meaningfully transforming government data into a strategic asset is upon us, and we hope the continued enthusiasm and support for the Data Coalition’s efforts will sustain this momentum in the coming years. More importantly, have confidence that in 2020, conversations about better using government data as an asset will continue to bring together Republicans and Democrats as we work to achieve common goals for improving society and meeting the needs of the American people.


  • December 09, 2019 9:00 AM | Data Coalition Team (Administrator)

    The United States government is rapidly progressing in implementing new laws and guidance on evidence-based policymaking, data-driven government, and open data. During the Data Coalition’s GovDATAx Summit in 2019, Department of Commerce Deputy Secretary Karen Dunn Kelley suggested that a metaphorical book on evidence-based policymaking for the U.S. is being written, with new chapters every year. In Chapter 1, in 2017, the U.S. Commission on Evidence-Based Policymaking wrote a seminal report on better using government data. In Chapter 2, in 2018, Congress passed the Foundations for Evidence-Based Policymaking Act (Evidence Act), and in Chapter 3, in 2019, the Executive Branch published its Federal Data Strategy.

    What’s the next chapter? 

    Data Coalition members met with senior leadership at the Department of Commerce to discuss Chapter 4 leading into 2020 and beyond. In addition, groups like Project Evident, the Brookings Institution, and the University of Chicago’s Center for Impact Sciences are all developing plans for the next generation of evidence-based policymaking.

    Key aspects and inputs for the next chapter will include:

    • The one-year action plan for the Federal Data Strategy, expected to be released in mid-December, which serves as one of the implementation vehicles for the Evidence Act.
    • Recommendations developed by the new Advisory Committee on Data for Evidence Building, hosted at the Department of Commerce, which will begin meeting in early 2020.
    • Plans to develop a national Data Service to promote privacy-protective uses of data and applications of cutting-edge technologies inside government, building on funding expected to be included in the final fiscal year 2020 appropriations.
    • Publication of agency draft plans and policies for implementing key provisions of the Evidence Act, including some signals as early as February 2020 in the President’s Budget.

    Needless to say, there is much work to do to ensure our government increasingly adopts evidence-based and data-driven approaches.

    What can those outside government do to support evidence-based policymaking?

    As the next chapter of the evidence and data movement is written, stakeholders in non-profits, academia, and the private sector can all contribute. Action items could include:

    • Offer Proactive Suggestions. For agencies working in your priority issue areas, offer suggestions about critical datasets, gaps in data quality, needed data standards, or programs that should be evaluated. These suggestions can go directly to the new chief data officers, evaluation officers, and statistical officials in federal agencies, even before they are requested.
    • Express Support. As work continues to change agency cultures to better recognize government data as an asset, agency leadership will benefit from sustained reminders and expressions of support from non-governmental stakeholders about the need to prioritize implementation of the Evidence Act and the Federal Data Strategy.
    • Participate in Implementation. As agencies begin to publish learning agendas, plans for open data, data governance processes, and budgets that outline how data management activities will be prioritized, stakeholders outside government should actively participate by providing feedback on the plans and policies.

    As the federal government continues to improve data quality, access, and ease of use, the Data Coalition will continue to support its members in engaging in each of the action items as the next chapter is written, and beyond.


  • September 18, 2019 9:00 AM | Data Coalition Team (Administrator)

    One of the largest data collection exercises undertaken by the United States government every 10 years is about to begin – the decennial census. In June 2019 the Data Coalition announced its formal partnership with the U.S. Census Bureau to help promote an accurate count, with support from the business community.

    Data Coalition members met with U.S. Census Bureau Deputy Director Ron Jarmin to discuss the inflection points as the country prepares for the 2020 census, in addition to how the business community’s technology and data analytics firms can be more involved. Here are three key takeaways from the discussion:

    #1: The benefits of a high-quality, accurate count are both clear and significant. 

    The country needs a high-quality census count to support a range of activities beyond allocating representation in Congress, from evidence-based policymaking in government to data-driven decision-making in the private sector. Decision-makers throughout government rely on the census to allocate grant funds, determine the number of households in a region when assessing program needs, and benchmark other surveys, such as the American Community Survey. Similarly, the business community relies on census data to decide where to locate new offices or retail establishments and to inform market insights.

    An accurate count is also a difficult feat, because some populations are especially hard to count reliably: migrant communities, children under five, and homeless individuals, to name a few. All of these populations are important to capture because, as population dynamics shift, government and businesses must be able to respond accordingly to ensure public and private services are efficient, effective, and meet the expectations of the American people.

    #2: Census privacy and confidentiality protections are strong.

    While there has been much discourse over the past year about how certain data may be used to support decision-making in today’s environment, Census Bureau data must be held in confidence.

    The Census Bureau is only allowed to release summary statistics or information that does not include personal identifiers to the public. Any violation of this policy is also a violation of two separate laws – the Census Act and the Confidential Information Protection and Statistical Efficiency Act of 2018 – potentially carrying $250,000 in fines and up to five years’ imprisonment under each law for each offense. Needless to say, these legal safeguards are taken seriously throughout the Census Bureau’s professional staff, and the American public should be assured that their confidential data will be strictly protected.

    #3: Every organization can – and should – play a role in supporting an accurate count.

    There are numerous tactics that businesses and non-profit organizations, whether large or small, can use to support an accurate census count. Examples range from promoting the census on websites and encouraging employees to respond in 2020 (through emails or even breakroom posters) to targeted support services that meet particular needs.

    Data Coalition member Esri published a resource in July 2019 explaining relevant methodological and technology tools for supporting the geospatial capabilities needed for the census. Another Data Coalition member, Tableau, is supporting the Census Bureau’s efforts to track response rates once the census begins, so that local community organizers have efficient metrics to support their efforts. Deloitte Consulting offers a variety of IT and management support roles to support efficient execution of the 2020 census. New member USAFacts is working to promote the new features of next year’s census. The Census Bureau continues to search for partners in the business community for the 2020 census.

    Other data and technology partners are critical for supporting the census, as social media and internet efforts have rapidly advanced over the past decade. The 2020 census will allow responses through the internet, making responding easier than ever, but the risk of misinformation campaigns and the presence of cybersecurity threats are real. Technology and data companies can help the census reduce the risks of executing a count in the modern world.

    The Data Coalition will continue to support the effort to ensure the 2020 census results in useful data for the American people. 


  • September 06, 2019 9:00 AM | Data Coalition Team (Administrator)

    Over the past two years, the prospect of the United States government and key decision-makers becoming more steeped in evidence-based policymaking has become increasingly bright. 

    On September 7, 2017, the U.S. Commission on Evidence-Based Policymaking (Evidence Commission) released its final report to the President and Congress with a strategy for better using the data that government already collects. The report contained 22 unanimous recommendations that focused on responsibly improving access to government data, strengthening privacy protections, and expanding the capacity to generate and use evidence.

    Progress on Fulfilling the Commission’s Vision

    While action has not yet been taken on all of the Evidence Commission’s recommendations, significant progress has occurred over the past two years. Here are some key highlights:

    • Evidence Act. The Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act), enacted in January 2019, addresses half of the commission’s suggestions. It includes new directives around establishing federal government leadership roles for data management, program evaluation, and statistical expertise. The Evidence Act establishes new expectations for open data, data inventories, and improved data management. It also reinforces one of the strongest privacy and confidentiality laws in the country: the Confidential Information Protection and Statistical Efficiency Act, which ensures that when the government pledges confidentiality to the American public, all steps are taken to appropriately protect data.
    • Federal Data Strategy. In parallel to the Evidence Act, the White House’s management agenda includes the development of a Federal Data Strategy to recognize data as a strategic asset. The agenda prominently incorporates many of the concepts and approaches developed by the commission in the 10-year plan for improving government data access, management, and use. While the formal action plan directing agencies is not yet final, it is expected to reinforce many of the recommendations from the Evidence Commission included in the Evidence Act. 
    • Guidance to Agencies. In summer 2019, the White House’s Office of Management and Budget (OMB) issued multiple guidance documents to agencies about addressing certain Evidence Commission recommendations. These included designating evaluation officers, appointing chief data officers, identifying statistical experts, developing “learning agendas,” and incorporating new actions into annual budget and performance plans.
    • Individual Agency Actions. In many cases, agencies started making progress in implementing the Evidence Commission recommendations even before guidance was issued. For example, the Small Business Administration developed and released its enterprise learning agenda in early 2018 and the Department of Health and Human Services developed an agency-specific data strategy released earlier in 2019.

    Next Steps on Fulfilling the Commission’s Vision

    The Evidence Commission set the stage for monumental changes in government data laws, processes, and culture. Agencies have initiated wholesale overhauls of data management practices and the recommendations are quickly becoming reality.

    But much work remains to fulfill the bipartisan vision outlined by the Evidence Commission – that government data are meaningfully analyzed to produce credible evidence that is actually used to inform policymaking. In the coming months and years, here are five areas for further attention:

    1. Development and Authorization of the National Secure Data Service. The commission’s headline recommendation was alluded to in the Evidence Act with the establishment of an advisory committee to develop a detailed roadmap, but implementing the National Secure Data Service will require Congress and the President to provide appropriate legal authority to ensure the envisioned data linkage capabilities adequately meet the American public’s privacy expectations.
    2. Improved Access to Income and Earnings Data. The commission prominently highlighted that income data are used as a valuable outcome metric for evaluating a wide range of federal programs and policies. Yet these data are among the most difficult to access, even when privacy guarantees are in place. Additional efforts are needed to enable researcher access to the National Directory of New Hires, along with other targeted improvements to this information, which serves as a valuable proxy measure for improvements to quality of life.
    3. Reforms to Existing Government Processes. The commission highlighted existing limitations of government’s data collection requirements and some issues for allocating funding to meet specific data and evidence needs. Clear enhancements to the Paperwork Reduction Act are still needed.
    4. Exploration of New Technologies. While some new research on emerging technologies for safely conducting data sharing activities occurred in the past two years, there are critical gaps that remain. Government agencies and the private sector must deliberately invest in needed research to enable effective data management and use. 
    5. Sustained Attention for Effective Evidence Act Implementation. The broad requirements for federal agencies embodied in the Evidence Act will only be successful with sustained attention from senior agency leaders and with adequate funding. Congress must ensure agencies receive appropriate resources in the next fiscal year to capitalize on the momentum for improving data quality, increasing the availability of open data, and developing useful analytics and evaluation for policymakers. 

    Today, the Evidence Commission’s legacy can be celebrated as a substantial accomplishment in developing a sound and actionable strategy for better using government data. While more attention is needed to change government’s culture and operations to be more evidence-based, the early steps to better manage and use data are exceedingly promising.


  • September 04, 2019 9:00 AM | Data Coalition Team (Administrator)

    Guest blog by Jane Wiseman, Institute for Excellence in Government.

    As a fellow at the Harvard Kennedy School helping to support a national network of state chief data officers, I’ve had a front-row seat to leading-edge analytics for the past three years. As a more recent observer of chief data officers in the federal government, I’ve been impressed by the excellence and diversity of innovative data work being done by the pioneering data officers in federal service.

    While reading the Federal Data Strategy Action Plan, I was inspired by how detailed and thoughtful it is.  Actions address a range of important topics, and it’s wonderful to see data governance and data literacy getting attention, as these topics are not glamorous and sometimes get too little focus at the state and local level.  There’s an aggressive timeline in the action plan and a remarkable amount has already been accomplished.

    I was honored to share my thoughts at the Federal Data Forum hosted by the Data Coalition and the White House in early July, and was energized by the range of experts who shared their ideas. The simple fact that OMB is creating a data strategy for the federal government is one of the most exciting developments in data-driven government in my career and offers a tremendous opportunity to deliver more value for taxpayers.  

    Listening to other speakers during the forum, I was surprised twice: first by the common concern about the gap between the data literacy government managers need and their current skill levels, and second by how few voices were calling for agencies to create and publish their own data strategies.

    Observation #1: The federal government needs to invest in data literacy.  

    Most leaders and decision-makers in government are not digital natives and many lack confidence in their data and digital skills.  And this gap will slow the adoption of data-driven government.  

    • With 2.5 quintillion bytes of data created every day, we are swimming in data. Unfortunately, few senior leaders are fully comfortable with their own data knowledge and skills, or their organization’s capacity for data use: according to a PwC study, 4 out of 5 government leaders say decision-making is either somewhat or rarely data-driven.
    • There are 90,000 units of local government in the US. Only a small percentage have a designated data leader such as a chief data officer (CDO), and only 20 have chief data officers who are part of the Civic Analytics Network (CAN), a peer network of leading local government data leaders.
    • Data-driven government is about decision-making. But if decision-makers are not data-literate, you can have all the data scientists in the world in an agency and decisions still won’t use their results; it will be as if the results are in Greek and no one translated.

    Executives need to know how to ask for data analysis (what’s possible and what’s not) and how to critique results. They don’t need to be able to code, build an algorithm, or make a map themselves, but they should know the power and capability of modern data tools. Basic data literacy means knowing how to ask good questions that inspire analysis, and then having the confidence to interpret and use the results. Consider a comparative study of adult workforce skills in “literacy, numeracy, and problem-solving” across 23 countries, including the US: Japan and Finland led the pack on numeracy, while the United States ranked a disappointing 21 out of 23 participating countries. Closing this achievement gap and moving toward large-scale organizational change to data-driven government will take a variety of training and coaching offerings. And it will take time.

    Recommendation: The federal government should provide a full suite of data literacy training, support, and coaching for senior leaders and management executives in government so that they have the confidence to lead data efforts.

    Observation #2: Each agency should have its own data strategy and plan.

    As the saying goes, “If we don’t know where we’re going we might end up somewhere else.”  

    In the Federal Data Strategy Action Plan, actions 12-16 call for parts of what would be agency-specific data strategies. But this call to action falls short of asking each agency head to publish a multi-year data strategy and to report annually on progress toward achieving that strategy. We need to know where each agency is going in order to achieve data-driven decision making at every level of the organization and across all bureaus and agencies. Without a long-term strategy, we can’t hope for permanent culture change.  

    The elements that exist in the action items, from data governance to open data to important data questions, need to be knit together into a multi-year plan for optimizing the use of data. Targets must also be set for each element and reported on an annual basis in a cohesive, integrated, department-wide plan.

    With a clear charter from the chief executive, and armed with a strategy, a chief data officer can define a roadmap describing the difference the chief data officer’s team can make in government over a three- to five-year horizon. The best chief data officers at the federal, state, and local levels operate from a strategy, a guiding document that sets mission and vision. Sharing their strategies helps them stay focused and communicates to others what is and is not expected (e.g., the chief data officer is not the person you call to fix the printer or an email server issue).

    Agency heads must be the individuals ultimately responsible for their data strategy and must invest their chief data officers with the authority and resources to carry it out. It needs to be clear to all that the chief executive has a strong relationship with and relies on data, and views the chief data officer as a trusted and important resource to the organization.

    Strategy is about mission and outlining key questions. Asking good questions matters and clearly defining how the analytics research question will generate public value is important. The importance of understanding the connection to priority policy questions is summed up well by one of the CDOs I interviewed last year who said, “You might be Lord Algorithm but if you don’t stop to understand the problem, you will never succeed in making government better.”   

    When the UK created the Government Digital Service in 2011, its focus was on long-term transformation of government to digital services, and it has made consistent progress every year. But it didn’t all happen overnight. One of the keys was making sure every agency had a point person and that he or she had a roadmap. We need that same level of focus on individual federal agency strategies.

    Recommendation: OMB should hold agency heads accountable on an annual basis for progress on achieving data-driven government, require them to lead their department’s multi-year strategic plan for data, and require the publication of their data strategy.

    Successful Implementation Holds Promise for Lasting Impact 

    With this exciting momentum at the federal level toward becoming more data-driven in decision-making, there is a tremendous opportunity for the federal government to support the development of capacity in state and local government as well.  For example:  

    • The tools created for federal agencies will benefit state and local governments, where there is far less capacity to build knowledge capital and to document tools and methods.  A great example is the data ethics framework and data protection toolkit called for in actions 3 and 4 in the federal data strategy.
    • The federal government should create Data Analytics Hubs or Centers of Excellence in a service model, ensuring state and local governments do not have to replicate analytics capabilities. Rather, major economies of scale can be achieved if the federal government establishes analytics service hubs to provide data analysis for local governments.
    • Procurement expertise and advice could also be part of these Centers of Excellence. With the rapid growth of the analytics and data mining vendor field, no local official has time to stay abreast of trends and to know the right questions to ask to ensure that procurements properly protect data privacy and security and account for data ethics. A central resource at the federal level could support better investments at the state and local level and reduce the risk of data breaches and improper or wasteful spending of state and local funds on data analytics projects.

    The federal government has a unique opportunity at this moment to help incubate and advance the field with actions in every agency. Leadership and investment in capacity now will pay dividends long into the future.  

    The Federal Data Strategy, with an iterative and collaborative approach, should support agencies moving forward. If it’s done right, the strategy will lead to a major transformation of government.  


  • August 06, 2019 9:00 AM | Data Coalition Team (Administrator)

    Whether government should use data to improve society is no longer up for debate – the answer is definitively yes. When high-quality government information is accessible, it can be applied to generate insights that help decision-makers develop better policies for the American people. We’ve seen successes in early education, disease prevention, and government transparency efforts, and more could be done with better access to data.

    For too long, our government operated in silos to address some of the issues related to data access and use, without planning for a wide range of data users. Too frequently, the data community reinforces its own silos rather than working in collaboration. That is why the Data Coalition is launching the GovDATAx Summit in 2019.

    Our inaugural GovDATAx Summit will be the beginning of a renewed dialogue about how we improve government’s data policies to truly unleash the power of data for the public good. The data community should work to empower innovators to generate insights that improve our economy and the quality of life for the American public. The conversation at GovDATAx is intended for an audience that:

    • Recognizes data as a strategic asset;
    • Prioritizes government’s need to establish an environment for accessing information; and
    • Values responsible and ethical uses of data. 

    During the Summit’s main sessions, experts from the White House, agency leadership, academia, and the private sector will discuss important new bipartisan policies that are being implemented this year, like the Evidence Act, which establishes new data leaders – chief data officers, evaluation officers, and statistical experts – across the federal government. GovDATAx will feature a discussion of important data standards that lead to better data quality, promote opportunities for public-private partnerships, and present exemplars about what works for using data as an asset. 

    Be a part of shaping the future of government data. Join the Data Coalition and hundreds of leaders from across the public sector, businesses, non-profits, and academia on Wednesday, October 30, to discuss the next steps for developing policies that unleash data for good.


  • July 10, 2019 9:00 AM | Data Coalition Team (Administrator)

    The American public provides an incredible amount of information to the federal government – valued at $140 billion each year. This information is provided as part of businesses complying with regulations, individuals and firms paying taxes, and individuals applying for programs. 

    The development of the Executive Branch’s Federal Data Strategy is an effort to better organize and use all of that information to improve decision-making. On July 8, 2019, the Data Coalition joined the White House to co-sponsor a public forum gathering feedback on what actions the federal government will undertake over the next year to begin implementing the data strategy. 

    Kicking off the event, Dr. Kelvin Droegemeier, Director of the White House’s Office of Science and Technology Policy, stressed that a key goal of the strategy is to “liberate” government data for society’s use, noting that nearly 90 percent of government data go unused. During the forum, 52 speakers – including 12 members of the Data Coalition – and more than 100 other experts provided input on how to improve the draft plan.

    Here are four takeaways from the public comments provided during the forum:

    #1. Leadership is Essential for Realizing Culture Change

    New legislation enacted in early 2019 creates several new positions in government agencies, such as chief data officers, evaluation officers, and statistical experts. Throughout the public forum, speakers stressed the need for these leaders to be empowered to institute changes within federal agencies and informed about how to most effectively implement best practices for data access, management, and use.

    Several speakers specifically stressed the critical role for newly established chief data officers in improving data quality and usefulness across government, in addition to providing improved training and tools for agencies to equip the federal workforce to use data. The concept of data literacy was also prominently featured, meaning that throughout federal agencies the workforce should be trained routinely about responsibilities and techniques for responsibly managing and using data. 

    #2. Targeted Data Standards Offer Opportunities for Efficiency

    The need for improved data standards was discussed by speakers on more than half the panels during the event, with suggestions that the Federal Data Strategy could do more to encourage standards in the areas of financial regulatory reporting, agency spending and grant management, geospatial data, and organizational entities. For example, multiple speakers highlighted the opportunity to include more direction about the adoption of common business entity identifiers, like the globally recognized Legal Entity Identifier (LEI), as a means of improving analytical capabilities while also reducing reporting burdens on regulated entities.

    #3. Partnerships are an Important Element for Success

    Many speakers noted their appreciation for the opportunity to provide feedback on the data strategy, and encouraged ongoing collaboration with those outside government through public-private partnerships. As the strategy is implemented over the next year, industry, non-profits, academics, and others in the American public should have opportunities to weigh in and hold agencies accountable for achieving the stated goals in the plan. Partnerships also offer agencies a specific means to coordinate with those outside of government to ensure the implemented policies and practices achieve meaningful improvements in the country’s data infrastructure. 

    #4. Coordination at OMB and Agencies is Key

    Finally, because government data collected by one agency are often relevant for another, coordination is a critical component of success in the Federal Data Strategy. Speakers highlighted that the proposed OMB Data Council could serve as a model for agencies on how to work across interests, laws, and policy domains to achieve lasting change. But coordination is not OMB’s responsibility alone: every agency must coordinate across its own programs and leaders to promote culture change and a data-literate workforce, and to allocate resources toward the goals of the strategy.

    In the coming months, the action plan will be finalized and publicly released, incorporating the comments from the Coalition-White House public forum along with other written feedback. The Data Coalition looks forward to continuing to partner with the federal government to ensure our national data policies truly make data an asset for the country.  

    To read the Data Coalition’s comments on the Strategy’s Draft Action Plan, click here.


  • May 16, 2019 9:00 AM | Data Coalition Team (Administrator)

    The following comments were adapted from the RegTech Data Summit Keynote Address of Ben Harris, Chief Economist, Results for America – Former Chief Economist and Economic Advisor to Vice President Joe Biden. Delivered April 23, 2019 in New York City.

    In a time of seemingly insurmountable partisanship, Congress was able to come together around the issue of evidence-based policy and pass the Foundations for Evidence-Based Policymaking Act (Evidence Act) that makes some dramatic changes to our ability to learn from data and evidence in our quest for better policy. As many of you may know, the OPEN Government Data Act—which was included in the Evidence Act—implements some remarkable changes, including:

    • Installing Chief Data Officers at all federal agencies;
    • Documenting and coordinating the massive breadth of data collected by agencies; and
    • Directing that non-sensitive government data be open by default.

    To start, it’s probably worthwhile to acknowledge that issues like data standardization and calls for access to open data might not be the sexiest topics, but we all can appreciate their importance.

    As a newcomer to this topic, I have only begun to understand and appreciate the need for widespread access to standardized, machine-readable data.

    Standardized and accessible data are crucial, but so is the need to inject more evidence-based policy into our system of legislation and regulation.

    I have come to believe that our whole economy, not just government regulators, faces a massive information deficit. Our economy, which now runs on data and information more than ever, still has gaping holes in the availability of information, which undermine markets and can lead to wildly inefficient outcomes.

    When it comes to evidence-based policymaking, our government has a long way to go. As pointed out in a 2013 op-ed in The Atlantic by Peter Orszag and John Bridgeland, only about one percent of our federal dollars are allocated based on evidence and evaluations. From my perspective, this is about ninety-nine percentage points too few.

    The lack of evaluation can inject massive and longstanding inefficiencies into our federal, state, and city-level budgets, resulting in wasteful spending and missed opportunities to improve lives. This is never more evident than in our country’s $1.5 trillion tax expenditure budget. We have hardly stopped to ask whether the $1.5 trillion spent annually on targeted tax breaks is achieving its desired objectives.

    The benefits of better evidence and data extend well beyond direct spending and tax administration. They can mitigate the economic pain caused by a recession. Indeed, the severity of the financial crisis was exacerbated by the information deficit in the wake of Lehman’s collapse and the inevitable chaos that followed. Had financial firms and regulators been able to more accurately and quickly assess the extent of the damage through standardized financial data, we would have seen less radical moves by investors to withdraw from credit risk and more effective government intervention. Of all the factors that played a role in the crisis, I don’t think it’s hyperbole to say that the lack of data standardization is perhaps the least appreciated.

    Evidence-based policy is also not just a matter of better government. It’s about people’s faith in government in the first place. Results for America recently commissioned a nationally representative survey about Americans’ attitudes toward the role of evidence in policymaking. When asked what most drives policymakers’ decisions, a whopping forty-two percent said “boosting popularity, or getting votes,” while thirty-four percent said it was the influence of lobbyists, and just eight percent said it was evidence about what works. Surely these responses are cause for concern.

    Fortunately, there are solutions.

    To start, in a time when there are seemingly no bipartisan bills, we saw the passage of the Evidence Act—which is known to some as the umbrella bill for the OPEN Government Data Act. As I noted at the beginning, the Evidence Act represents a major step forward not just for the capacity of government agencies to implement evidence-based policy, but for the public to gain access to open, machine-readable data.

    Of course, this law is the beginning, not the end. We can help solve private market inefficiencies by calling for more data.

    • When it comes to better understanding the fees charged by financial advisers, the U.S. Securities and Exchange Commission (SEC) can amend Form ADV to include explicit questions on fees charged. It’s that simple.
    • When it comes to evaluating government programs, I can think of no more powerful tool than providing federal agencies a 1 percent set-aside for evaluation. Results for America has called for this for years, and it’s time that Congress pick up the charge.
    • When it comes to evaluating the $1.5 trillion tax expenditure budget, we’ll have to make some institutional changes. One option is to expand the capacity of a federal entity, like the Internal Revenue Service (IRS) or the White House Office of Management and Budget (OMB), to include periodic evaluations of this budget. Another is to call for regular Congressional approval, similar to the process for appropriations.
    • And as we prepare for the possibility of the next recession, we also need to finish the work, begun in earnest, of making adoption of Legal Entity Identifiers (or LEIs) ubiquitous across the financial sector. While the progress since the Great Recession has been impressive, we have more work to do to ensure this system covers not only entities in the U.S. but those of our economic allies as well.

    These reforms can and should be viewed as steps to aid the private sector, hopefully leading to better economic outcomes, lessened regulatory burdens, or both.

    On the whole, I am clear-eyed about the challenges faced by advocates for evidence-based policy. But the passage of the Evidence Act makes it clear that progress can be made. To me, it feels like we are on the cusp of a new movement to incorporate data and evidence in all that government does. Together we can help ensure that policy does a better job of incorporating data and evidence, leading to improved lives for all Americans.


  • April 05, 2019 9:00 AM | Data Coalition Team (Administrator)

    In recent years, we have seen an explosion of regulatory technology, or “RegTech.” These solutions have the potential to transform the regulatory reporting process for the financial industry and the U.S. federal government. But RegTech can only thrive if government financial regulatory agencies, like the Securities and Exchange Commission (SEC), the Commodity Futures Trading Commission (CFTC), and the Federal Deposit Insurance Corporation (FDIC), adopt structured open data standards for the forms they collect from the private sector. We have seen changes and momentum for RegTech adoption is picking up, but there is much more to be done.

    At this year’s RegTech Data Summit on Tuesday, April 23, in New York, we’ll explore the intersection of regulatory reporting, emerging technology, and open data standards with financial regulators, industry leaders, RegTech experts, academics, and open data advocates.

    The Data Coalition has long advocated for RegTech policy reforms that make government regulatory reporting more efficient and less burdensome on both agencies and regulated entities. The benefits are clear. Unified data frameworks support efficient analytical systems; common, open data standards clear the path for more accurate market risk assessments among regulators and bolster transparency.

    The Summit comes at an opportune time. Federal financial regulators have already begun replacing document-based filings with open data standards.

    Within the past year, the SEC voted to mandate inline eXtensible Business Reporting Language (iXBRL) for corporate financial filings. The Federal Energy Regulatory Commission (FERC) proposed a rule change that would require a transition to XBRL from XML. The House of Representatives held the first-ever hearing on Standard Business Reporting (SBR). The Financial Stability Oversight Council (FSOC) reiterated its recommendation for the adoption of the Legal Entity Identifier (LEI) to improve data quality, oversight, and reporting efficiencies.

    There are also a number of international examples of government-wide adoption of open data that can serve as a guide for similar efforts in the U.S., which we will explore at our Summit. Standard Business Reporting (SBR), as successfully implemented by Australia, is still the gold standard for regulatory modernization efforts. By utilizing a standardized data structure to build SBR compliance solutions, Australia was able to streamline its reporting processes, saving its government and private sector AUD 1 billion in 2015-2016.

    The success of SBR in Australia is undeniable. During our Summit, panelists will discuss how Congress is considering policy reforms that would enable the adoption of RegTech solutions and could, in theory, deliver the same kind of savings here. The Coalition has supported the Financial Transparency Act (FTA) since its introduction (H.R. 1530, 115th Congress). The FTA directs the eight major U.S. financial regulatory agencies to publish the information they collect from financial entities as open data: electronically searchable, downloadable in bulk, and free of license restrictions.

    Once financial regulatory reporting is expressed as standardized, open data instead of disconnected documents, RegTech applications can republish, analyze, and automate reporting processes, providing deeper insight and cutting costs.

    Entity identification systems are another pain point for the U.S. regulatory community. A recent Data Foundation report, jointly published with the Global Legal Entity Identifier Foundation (GLEIF), discovered that the U.S. federal government uses at least fifty distinct entity identification systems – all of which are separate and incompatible.

    If widely and properly implemented in the United States, a comprehensive entity identification system based on the LEI could help identify and mitigate risk in financial markets, track and debar low-performing federal contractors, improve supply chain efficiency, and generally be useful anywhere a government-to-business relationship exists. By working together, industry and government leaders can reap the benefits of these emerging RegTech solutions and open data applications.
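    To illustrate why a single, open identifier is so useful, here is a minimal Python sketch of validating an LEI offline. It relies on the ISO 17442 format, in which an LEI is 20 alphanumeric characters ending in two ISO 7064 MOD 97-10 check digits (the same checksum family used for IBANs); the sample identifier below is constructed for the demo and is not a real registered LEI.

    import re

    def is_valid_lei(lei):
        """Check an LEI's ISO 17442 format and its MOD 97-10 check digits."""
        lei = lei.strip().upper()
        # 18 alphanumeric characters followed by 2 numeric check digits.
        if not re.fullmatch(r"[0-9A-Z]{18}[0-9]{2}", lei):
            return False
        # Map letters to two-digit values (A=10 ... Z=35), keep digits as-is;
        # the resulting integer must be congruent to 1 modulo 97.
        return int("".join(str(int(c, 36)) for c in lei)) % 97 == 1

    def append_check_digits(base18):
        """Compute the two check digits for an 18-character LEI base."""
        remainder = int("".join(str(int(c, 36)) for c in base18.upper() + "00")) % 97
        return base18.upper() + "%02d" % (98 - remainder)

    demo = append_check_digits("5493001KJTIIGC8Y1R")  # hypothetical base, not a registered LEI
    print(demo, is_valid_lei(demo))  # validates by construction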

    Karla McKenna, who is Head of Standards at GLEIF and specializes in international financial standards, and Matt Reed, Chief Counsel at the U.S. Treasury’s Office of Financial Research, are among the leading voices we will hear from at the Summit. Together with Ken Lamar, former Special Vice President at the Federal Reserve Bank of New York, and Robin Doyle, Managing Director at the Office of Regulatory Affairs at J.P. Morgan Chase, they will analyze the status of open standards and the impact of a single entity identifier.

    We’ll be delving into RegTech applications like blockchain, analytic applications, and AI systems, as well as policies that will transform regulatory reporting like the FTA, and more at the second annual RegTech Data Summit on April 23. The Summit will convene financial regulators, industry leaders, academics, and open data advocates to discuss the latest innovations in regulatory technology and what the future holds.

    Summit-goers will have the opportunity to hear from SEC, Treasury, FDIC, and J.P. Morgan Chase representatives just to name a few. Featured speakers include former SEC Commissioner Troy Paredes; Dessa Glasser, Principal, The Financial Risk Group and formerly CDO, J.P. Morgan Asset Management; and Mark Montoya, Senior Business Analyst, FDIC.

    The Summit will focus on three main themes as we explore the future of U.S. regulatory reporting technology:

    • Enterprise Digitization: The modern enterprise faces a myriad of internal and external data challenges. By internally aligning common data formats and adopting open standards, financial institutions can build a competitive information foundation to more efficiently leverage emerging technology.
    • Open Data Standards: Adopting a single, open data standard for entity identification among U.S. regulatory agencies would create a framework for financial institutions and regulators to more accurately assess market risk, improve reporting efficiencies, lower transaction costs, and improve data quality.  
    • Reporting Modernization: By adopting open data standards, the U.S. government will be able to improve oversight and provide higher levels of accountability to citizens; facilitate data-driven analysis and decision making in agencies; and expand the use of automation, which will reduce compliance costs.

    It is clear that RegTech solutions will disrupt compliance norms by increasing efficiency, enhancing transparency, and driving analytics. However, successful implementation of this technology is only possible when government and industry focus on collecting, reporting, and publishing quality, structured data. If you are eager to explore a future of compliance in which document-based regulatory reporting is a thing of the past, then join us at the second annual RegTech Data Summit: The Intersection of Regulation, Data, and Technology.

    For more information on the Summit, check out our event webpage.

