Blog
The release of the President’s annual budget proposal offers insights into upcoming priorities and initiatives across the federal government. The Trump Administration’s fiscal year 2021 budget proposal provides some perspective on plans for implementing new data and evidence projects across government. This is the first full budget since enactment of the Foundations for Evidence-Based Policymaking Act (Evidence Act), including the OPEN Government Data Act, and the final Federal Data Strategy, so it is the first opportunity for agencies to appeal to Congress for support through new authority and resources.
Here are six key takeaways from the data and evidence priorities in the 2021 budget proposal:
More than one year after enactment of the Evidence Act and with the first annual action plan issued by the White House’s Office of Management and Budget (OMB), agencies are now expected to begin rapid implementation of core data and evidence obligations. The budget signals a new focus on tangible projects across agencies, covering issues ranging from new program evaluations to open data planning and the development of agency data inventories.
While OMB is expected to issue additional guidance in 2020 and develop a second-year action plan as part of the Federal Data Strategy, the budget proposal highlights that planned improvements are no longer just plans, but that real actions are underway. For example, the General Services Administration’s (GSA) Office of Evaluation Sciences has recognized its role in supporting new agency evaluation officers in developing learning agendas and is restructuring some of its activities accordingly.
The Evidence Act requires agencies to undertake a wide range of activities, further elaborated in the Federal Data Strategy. Even before complete guidance is available from OMB, multiple agencies are demonstrating rapid progress on Evidence Act implementation in their budget justifications. For example, the Treasury Department included descriptions of available evidence and analytical projects for every bureau in its congressional justification, some describing notably insightful projects. Other agencies describe new organizational processes and structures for ensuring data leaders receive sufficient support and empowerment. The Departments of Commerce and Education provide extensive detail about the roles and organization of new chief data officers in supporting data governance boards and management activities.
Some agencies appear to be forward-leaning with requests for additional resources, reallocations of existing resources, or flexible funding mechanisms to support data and evidence priorities. The Environmental Protection Agency requests funds for a new centralized evaluation unit, GSA requested additional resources to support the Chief Data Officer Council, and the Department of Labor requested flexibility in using funds for evaluation activities, to name a few. By identifying funding flexibilities and reallocations, agencies ease the burden on appropriators to find new funding streams within top-line budget levels, which also increases the likelihood that these initiatives will be funded.
Buried in the details of agency budget requests are several proposals that suggest continued progress in implementing the unanimous, bipartisan recommendations of the U.S. Commission on Evidence-Based Policymaking. The Department of Commerce’s Bureau of Economic Analysis and Census Bureau propose to create a federal data service, similar to the National Secure Data Service, initially proposed by the Evidence Commission.
The Department of Health and Human Services proposes increasing access to wage and earnings data maintained in the National Directory of New Hires for targeted purposes, including to support improved government decision-making. Another proposal revived in the budget request is the idea of combining the agencies that disseminate federal economic statistics. Collectively these proposals, among others, suggest a clear focus on maintaining bipartisan momentum for responsible data sharing across government.
The administration proposes to increase spending on artificial intelligence (AI) research and development activities in 2021, doubling current investments in preparation for building “industries of the future.” In practice, major investments that support AI also offer benefits to core data infrastructure and management across government agencies. New resources at the Departments of Agriculture, Defense, and Energy could further promote rapid development of AI capabilities in government with benefits to policymakers, the economy, and the American public.
The budget proposal recognizes a detail often lost in dialogues about data capabilities in government: the federal workforce. Government workers need continual training and re-skilling to support emerging needs in data science, analytics, and even privacy protection as technologies and methods evolve. To support these efforts, the budget requests new funding to attract diverse, highly skilled workers for AI, data analysis, and other emerging needs.
There is always room for critique of the budget, and not all agencies provide clarity about the role of chief data officers, the value of evaluation, or even signal the prioritization of Evidence Act implementation. There are also challenges in the administration’s top-line messaging, including overall funding levels and the removal of the long-standing data, statistical, and performance chapters that summarized activities in the Analytical Perspectives volume.
Importantly, the Trump budget proposal is a starting point for 2021 appropriations, and Congress will still make numerous changes before final funding levels are established. Even those who may disagree with the top-line spending levels or macro-policy choices will find much to applaud and support in the data and evidence priorities as Congress considers the proposal in coming months.
One year ago today, the Foundations for Evidence-Based Policymaking Act (Evidence Act) became law. Building on the recommendations of the U.S. Commission on Evidence-Based Policymaking, the Evidence Act and complementary guidance from the White House Office of Management and Budget on federal data policy have put work well underway to create a growing appetite for research evidence within the federal government’s Executive Branch agencies. These efforts also aim to reduce some of the hurdles to the use of government data within agencies, between agencies, and with the external research community, where much of the relevant evidence is generated.
I am Dean of a school populated by faculty who conduct policy research using such government data and teach students who will graduate and staff many federal agencies. How should our role, as external researchers and educators, be affected by implementation of the Evidence Act? One way that we can help with implementation is to focus on… implementation.
Executive agencies spend most of their time implementing policies legislated by Congress. This implementation process is wide-ranging – from decisions as visible and charged as changing fuel economy requirements for automobiles to nearly invisible and apparently innocuous choices like how to arrange the paragraphs in an informational letter to welfare program recipients to promote effective use of benefits. Researchers outside government, however, often have access to information about decision-making and relevant data only at the legislative policy level. Their research, therefore, tends to evaluate the impact of an overall policy decision rather than the nuances of its implementation.
One promising outgrowth of the Evidence Act is to narrow this mismatch. Agency learning agendas, mandated by the new law, can inform researchers of potentially compelling implementation choices at the agency level. Improved access to administrative data can give researchers the granularity of information needed to assess such decisions. As a corollary benefit, if external researchers focus attention in areas where agencies really need their insight, they can build partnerships with agency staff that enable them to use agency data to better inform analyses at the policy level as well.
The potential for this kind of synergy – simultaneously informing agency micro-implementation decisions and generating broad new knowledge – struck me in reading a recent study in my own area of health policy research. In 2015, some 6.1 million U.S. tax returns were subject to individual mandate penalties because some people on the return had failed to obtain health insurance. The Treasury Department was interested in learning about the best ways to design outreach letters encouraging those in this group to buy coverage in 2017 – a typical agency micro-implementation decision. In this case, though, there wasn’t enough funding available to conduct outreach to all eligible returns. Ithai Lurie and Janet McCubbin of the Treasury Department, together with Jacob Goldin, a Stanford Law Professor, designed an experiment in which they randomly assigned returns to one of several outreach letter formats or to a control group that did not receive an outreach letter at all. Then they used Treasury Department data on 2017 returns to see which of their outreach strategies worked best to encourage health insurance take-up in 2017.
But the team didn’t stop there. They merged the Internal Revenue Service (IRS) data with the Social Security death file, to see whether changes in the take-up of health insurance induced by their experiment affected mortality outcomes. That is, they built on their implementation research to evaluate the effects of the underlying program itself. Goldin, Lurie, and McCubbin found that the additional coverage induced by their more effective intervention letters actually reduced mortality among middle-aged adults over a two-year follow-up period.
The Goldin, Lurie, and McCubbin study is a great illustration of rigorous evidence development on implementation effectiveness. It’s also the first large-scale experimental study to show that health insurance reduces mortality. And it couldn’t have happened without collaboration between external researchers and agency staff, and without access to large-scale administrative data.
There are lessons in that collaboration for us as educators as well. We need to encourage our faculty to examine agency learning agendas, to put our insights to work where they are most needed. We need to educate our students — future agency staffers — about the value of data and about the array of approaches and perspectives available in the external research community, so that they actively welcome and seek out research partners.
Finally, as external researchers, we should take the same approach to the Evidence Act as we would to other important federal legislation – evaluate it! At this stage, the right focus is on evaluating the implementation choices made at the agency level, so that we develop an evidence base to make better use of evidence in policymaking.
As agencies proceed in implementing the Evidence Act during its second year, there is much promise for changing the relationship between the government and research community for the better. Success will require strong collaborations and partnerships, promising tremendous gains for producing and using meaningful evidence in coming years.
About the Author:
In 2013, Sherry Glied was named Dean of New York University’s Robert F. Wagner Graduate School of Public Service. From 1989-2013, she was Professor of Health Policy and Management at Columbia University’s Mailman School of Public Health. She was Chair of the Department of Health Policy and Management from 1998-2009. On June 22, 2010, Glied was confirmed by the U.S. Senate as Assistant Secretary for Planning and Evaluation at the Department of Health and Human Services, and served in that capacity from July 2010 through August 2012. She had previously served as Senior Economist for health care and labor market policy on the President’s Council of Economic Advisers in 1992-1993, under Presidents Bush and Clinton, and participated in the Clinton Health Care Task Force. She has been elected to the National Academy of Medicine, the National Academy of Social Insurance, and served as a member of the Commission on Evidence-Based Policymaking.
Glied’s principal areas of research are in health policy reform and mental health care policy. Her book on health care reform, Chronic Condition, was published by Harvard University Press in January 1998. Her book with Richard Frank, Better But Not Well: Mental Health Policy in the U.S. since 1950, was published by The Johns Hopkins University Press in 2006. She is co-editor, with Peter C. Smith, of The Oxford Handbook of Health Economics, which was published by the Oxford University Press in 2011.
Glied holds a B.A. in economics from Yale University, an M.A. in economics from the University of Toronto, and a Ph.D. in economics from Harvard University.
In the final weeks of 2019, Congress passed a significant piece of legislation, and the president signed the bipartisan bill into law Dec. 30, 2019: the Grant Reporting Efficiency and Agreements Transparency Act of 2019, or the GREAT Act, which will modernize federal grant reporting.
If your organization is a recipient or grantee of some of the more than $700 billion in annual financial assistance from the federal government, chances are the GREAT Act will affect you and how your agency, institution, or research lab reports information back to the federal government.
We’ve been following the GREAT Act closely and are here to help answer your questions about the legislation. Check out the Q&A below for what you need to know now, and what’s next now that the GREAT Act has been signed into law.
Federal grants touch nearly every American through grant programs that provide free or reduced school lunches, equip our police and fire departments, build and maintain our highways and transportation infrastructure, support small businesses, and much, much more.
The GREAT Act is intended to standardize a data structure for the information that recipients must report to federal agencies and to streamline the federal grant reporting process.
“We named it the GREAT Act for a reason,” Rep. Virginia Foxx (R-N.C.) said when she introduced the bill in 2017. “The results of the passage will be great for stakeholders, government agencies, job creators, grantees, and grantors.”
“Without updating the way we process grant reports, in many ways we might as well be still operating with a typewriter and a fax machine and Wite-Out,” Foxx said.
Based on my experience in implementing the DATA Act—the GREAT Act’s predecessor that requires financial data to be standardized, machine-readable, and transparent—here are three initial steps you can take to be proactive:
When taking these first steps, be sure to take an incremental approach. Let’s face it: change is hard. Do the easiest and smallest things first to make meaningful progress. Also, truly look at this as an opportunity to modernize and leverage the GREAT Act as a catalyst to push forward data and efficiency initiatives that may have stalled out in the past.
You probably won’t experience a great deal of change right away, as the act requires the White House Office of Management and Budget (OMB) and the designated “standard-setting agency”—likely the Department of Health and Human Services (HHS)—to develop data standards over the course of the next two years. Once the standards are set, federal grant-making agencies have one year to adopt the standards and issue guidance to recipients for all future information collection requests.
As mentioned above, you can take this time to implement new best practices that improve processes across the board, so you can not only be fully prepared for when the effective date arrives, but also take advantage of the new efficiencies in advance.
The law’s provisions require the new data standards be nonproprietary, machine-readable, and be in line with existing machine-readable data standards, including standards set under the Federal Funding Accountability and Transparency Act of 2006 used for USAspending.gov reporting.
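To make “nonproprietary and machine-readable” concrete, here is a minimal sketch of what a structured grant report could look like. This is purely illustrative: the GREAT Act’s actual data standards had not yet been developed at the time of writing, so every field name below is invented for the example and should not be taken as the eventual standard.

```python
import json

# Hypothetical grant report as structured data. All field names are
# invented for illustration; the real GREAT Act standards were still
# being developed when this was written.
grant_report = {
    "award_id": "HHS-2020-EXAMPLE-001",
    "reporting_period": {"start": "2020-01-01", "end": "2020-03-31"},
    "federal_share_expended": 125000.00,
    "recipient_name": "Example Research Lab",
}

# Unlike a PDF, a structured format like JSON can be parsed, validated,
# and aggregated automatically by a receiving agency's systems.
serialized = json.dumps(grant_report, indent=2)
parsed = json.loads(serialized)

# A simple automated completeness check an agency could run on receipt.
required_fields = {"award_id", "reporting_period", "federal_share_expended"}
assert required_fields <= parsed.keys()
```

The point of the sketch is the round trip: data entered once can be re-read by software without a human retyping it, which is what enables the “report once, use many times” model described above.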
If you are a grant recipient, what you report will likely not change substantially, but how you report will change in ways you might not have expected. More on those in the next section.
Some of the benefits to reporting organizations include a reduction in redundant reporting, potential for automated validation of a report’s completeness, automated delivery, and if you’re a high-performing organization, faster recognition of your excellent performance.
First benefit: Grant recipients may need to report less data. Recipients will be required to submit data in a machine-readable format, meaning that just sending PDFs will no longer be acceptable. Federal agencies will leverage technology to automate and streamline reporting processes, so recipients would report data only once, and federal agencies would use it multiple times.
Second benefit: Federal agencies can make more informed decisions when awarding grants. That’s a huge advantage if you are a recipient that consistently receives a clean audit opinion—chances are you will have greater access to federal grant dollars. The GREAT Act simplifies the current single audit reporting by requiring the gold mine of single audit data to be reported in an electronic format consistent with the data standards established under the act.
But if your single audit reports identify material weaknesses and repeat findings year after year, right now is a good time to get your house in order if you want to keep receiving federal funds.
Keep in mind this general timeline:
Our team at Workiva is committed to keeping you informed on the significant news related to the GREAT Act, whether you are a recipient, auditor, or federal grant-making agency. The Data Coalition, America’s premier voice on data policy, is another great resource for staying up-to-date. Check out the Data Coalition website, and subscribe to their newsletter.
Editor’s note: This blog has been updated and was originally published on December 20, 2019, at workiva.com.
In 2019, tremendous bipartisan efforts led to new federal data laws, progress on legislation that may become law in 2020, and gains inside the Executive Branch for implementing strategies to make government data more accessible and useable. Amidst a backdrop of hyper-partisanship and political turmoil in Washington, DC, the bipartisan approach for improving the federal government’s data policies cannot be overstated.
Here are a few highlights from the past 12 months on areas of success for Data Coalition priorities:
Other Data Coalition priorities are also advancing as work proceeds to establish a federal data service, Congress considers the Taxpayer Right to Know Act, which would develop program inventories, the National Security Commission on AI produces its final recommendations, the Advisory Committee on Data for Evidence Building is established and begins to deliberate, and our country’s lawmakers consider reforms through the Select Committee on the Modernization of Congress.
In short, the Data Coalition’s advocacy efforts paid off tremendously this year – and are continuing to accrue benefits – for the American people and our society. Through nearly 100 briefings on key policy priorities, the GovDATAx Summit, and countless other activities this year that bring together the corners of the data community, the Data Coalition’s members and expertise achieved real and lasting headway for the country.
In 2020 and beyond, as agencies work to implement the Evidence Act, the GREAT Act, and more, the Data Coalition members and staff will continue to serve as a resource to hold government accountable for effective implementation while also devising strategies for continuous improvement. The moment for more meaningfully transforming government data into a strategic asset is upon us and we hope the continued enthusiasm and support for the Data Coalition’s efforts will sustain this momentum in the coming years. More importantly, have confidence that in 2020 conversations about better using government data as an asset will continue to bring together Republicans and Democrats as we work to achieve common goals for improving society and meeting the needs of the American people.
The United States government is rapidly progressing in implementing new laws and guidance on evidence-based policymaking, data-driven government, and open data. During the Data Coalition’s GovDATAx Summit in 2019, Department of Commerce Deputy Secretary Karen Dunn Kelley suggested a metaphorical book on evidence-based policymaking for the U.S. is being written, with new chapters every year. In Chapter 1, in 2017, the U.S. Commission on Evidence-Based Policymaking wrote a seminal report on better using government data. In Chapter 2, in 2018, Congress passed the Foundations for Evidence-Based Policymaking Act (Evidence Act), and in Chapter 3, in 2019, the Executive Branch published its Federal Data Strategy.
What’s the next chapter?
Data Coalition members met with senior leadership at the Department of Commerce to discuss Chapter 4 leading into 2020 and beyond. In addition, groups like Project Evident, the Brookings Institution, and the University of Chicago’s Center for Impact Sciences are all developing plans for the next generation of evidence-based policymaking.
Key aspects and inputs for the next chapter will include:
Needless to say, there is much work to do to ensure our government increasingly adopts evidence-based and data-driven approaches.
What can those outside government do to support evidence-based policymaking?
As the next chapter of the evidence and data movement is written, stakeholders in non-profits, academia, and the private sector can all contribute. Action items could include:
As the federal government continues to improve data quality, access, and ease of use, the Data Coalition will continue to support its members in engaging in each of the action items as the next chapter is written, and beyond.
One of the largest data collection exercises undertaken by the United States government every 10 years is about to begin – the decennial census. In June 2019 the Data Coalition announced its formal partnership with the U.S. Census Bureau to help promote an accurate count, with support from the business community.
Data Coalition members met with U.S. Census Bureau Deputy Director Ron Jarmin to discuss the inflection points as the country prepares for the 2020 census, in addition to how the business community’s technology and data analytics firms can be more involved. Here are three key takeaways from the discussion:
#1: The benefits of a high-quality, accurate count are both clear and significant.
The country needs a high-quality census count to support a range of activities beyond allocating representation in Congress, from evidence-based policymaking in government to data-driven decision-making in the private sector. Decision-makers throughout government rely on the census to allocate grant funds, determine the number of households in a region for assessing program needs, and benchmark other surveys, such as the American Community Survey. Similarly, the business community relies on census data to determine where to locate new offices or retail establishments and to inform market insights.
An accurate count is also a difficult feat because some populations are especially hard to count reliably, including migrant communities, children under five, and homeless individuals, to name a few. All of these populations are important to capture because, as population dynamics shift, government and businesses must be able to respond accordingly to ensure public and private services are efficient, effective, and meet the expectations of the American people.
#2: Census privacy and confidentiality protections are strong.
While there has been much discourse over the past year about how certain data may be used to support decision-making in the current environment, Census Bureau data must be held in confidence.
The Census Bureau is only allowed to release summary statistics or information that does not include personal identifiers to the public. Any violation of this policy is also a violation of two separate laws – the Census Act and the Confidential Information Protection and Statistical Efficiency Act of 2018 – potentially carrying $250,000 in fines and up to five years’ imprisonment under each law for each offense. Needless to say, the safeguards in law and throughout the Census Bureau’s professional staff are taken seriously, and the American public should be assured that their confidential data will be strictly protected.
#3: Every organization can – and should – play a role in supporting an accurate count.
There are numerous tactics that businesses and non-profit organizations can use to support an accurate census count, and whether large or small, all organizations can play a role. Activities could range from promoting the census on websites and encouraging employees to respond in 2020 through emails or breakroom posters, to providing targeted support services that meet particular needs.
Data Coalition member Esri published a resource in July 2019 explaining relevant methodological and technology tools for supporting the geospatial capabilities needed for the census. Another Data Coalition member, Tableau, is supporting the Census Bureau’s efforts to track response rates once the census begins, so that local community organizers can have efficient metrics to support their efforts. Deloitte Consulting offers a variety of IT and management support roles to support efficient execution of the 2020 census. New member USAFacts is working to promote the new features of next year’s census. The Census Bureau continues to search for partners in the business community for the 2020 census.
Other data and technology partners are critical for supporting the census as social media and internet efforts have rapidly advanced over the past decade. The 2020 census will allow responses through the internet, making responding easier than ever, but the risks of misinformation campaigns and cybersecurity threats are real. Technology and data companies can help the Census Bureau reduce the risks of executing a census in the modern world.
The Data Coalition will continue to support the effort to ensure the 2020 census results in useful data for the American people.
Over the past two years, the prospect of the United States government and key decision-makers becoming more steeped in evidence-based policymaking has become increasingly bright.
On September 7, 2017, the U.S. Commission on Evidence-Based Policymaking (Evidence Commission) released its final report to the President and Congress with a strategy for better using the data that government already collects. The report contained 22 unanimous recommendations that focused on responsibly improving access to government data, strengthening privacy protections, and expanding the capacity to generate and use evidence.
Progress on Fulfilling the Commission’s Vision
While action has not yet been taken on all of the Evidence Commission’s recommendations, significant progress has occurred over the past two years. Here are some key highlights:
Next Steps on Fulfilling the Commission’s Vision
The Evidence Commission set the stage for monumental changes in government data laws, processes, and culture. Agencies have initiated wholesale overhauls of data management practices and the recommendations are quickly becoming reality.
But much work remains to fulfill the bipartisan vision outlined by the Evidence Commission – that government data are meaningfully analyzed to produce credible evidence that is actually used to inform policymaking. In the coming months and years, here are five areas for further attention:
Today, the Evidence Commission’s legacy can be celebrated as a substantial accomplishment in developing a sound and actionable strategy for better using government data. While more attention is needed to change government’s culture and operations to be more evidence-based, the early steps to better manage and use data are exceedingly promising.
Guest blog by Jane Wiseman, Institute for Excellence in Government.
As a fellow at the Harvard Kennedy School helping to support a national network of state chief data officers, I’ve had a front row seat to leading edge analytics for the past three years. As a more recent observer of chief data officers in the federal government, I’ve been impressed by the excellence and diversity of innovative data work being done by the pioneering data officers in federal service.
While reading the Federal Data Strategy Action Plan, I was inspired by how detailed and thoughtful it is. Actions address a range of important topics, and it’s wonderful to see data governance and data literacy getting attention, as these topics are not glamorous and sometimes get too little focus at the state and local level. There’s an aggressive timeline in the action plan and a remarkable amount has already been accomplished.
I was honored to share my thoughts at the Federal Data Forum hosted by the Data Coalition and the White House in early July, and was energized by the range of experts who shared their ideas. The simple fact that OMB is creating a data strategy for the federal government is one of the most exciting developments in data-driven government in my career and offers a tremendous opportunity to deliver more value for taxpayers.
Listening to other speakers during the forum, I was surprised twice – first by the common concern about the gap between the data literacy government managers need and the skills they currently have, and second by how few voices were calling for agencies to create and publish their own data strategies.
Observation #1: The federal government needs to invest in data literacy.
Most leaders and decision-makers in government are not digital natives, and many lack confidence in their data and digital skills. This gap will slow the adoption of data-driven government.
Executives need to know how to ask for data analysis (what’s possible and what’s not) and how to critique results. They don’t need to be able to code, build an algorithm, or make a map themselves, but they should know the power and capability of modern data tools. Basic data literacy means knowing how to ask good questions that inspire analysis, and then having the confidence to interpret and use the results. Consider a comparative study of adult workforce skills in “literacy, numeracy, and problem-solving”: Japan and Finland led the pack on numeracy, while the United States ranked a disappointing 21 out of 23 participating countries. Closing this achievement gap and moving toward large-scale organizational change to data-driven government will take a variety of training and coaching offerings. And it will take time.
Recommendation: The federal government should provide a full suite of data literacy training, support and coaching for senior leaders and management executives in government so that they have the confidence to lead data efforts.
Observation #2: Each agency should have its own data strategy and plan.
As the saying goes, “If we don’t know where we’re going we might end up somewhere else.”
In the Federal Data Strategy Action Plan, actions 12-16 call for parts of what would be agency-specific data strategies. But this call to action falls short of asking each agency head to publish a multi-year data strategy and to report annually on progress toward achieving that strategy. We need to know where each agency is going in order to achieve data-driven decision making at every level of the organization and across all bureaus and agencies. Without a long-term strategy, we can’t hope for permanent culture change.
The elements that exist in the action items — from data governance to open data to important data questions — need to be knit together into a multi-year plan for optimizing the use of data. Targets must also be set for each element and reported annually in a cohesive, integrated, department-wide plan.
With a clear charter from the chief executive, and armed with a strategy, a chief data officer can define a roadmap describing the difference the chief data officer’s team can make in government over a three- to five-year horizon. The best chief data officers at the federal, state, and local levels operate from a strategy, a guiding document that sets mission and vision. Sharing their strategies helps them stay focused, and it communicates to others what is and is not expected (e.g., the chief data officer is not the person you call to fix the printer or troubleshoot the email server).
Agency heads must be the individuals ultimately responsible for their data strategy and must invest their chief data officers with the authority and resources to carry it out. It needs to be clear to all that the chief executive relies on data and views the chief data officer as a trusted and important resource to the organization.
Strategy is about mission and outlining key questions. Asking good questions matters, and clearly defining how an analytics research question will generate public value is essential. The importance of understanding the connection to priority policy questions is summed up well by one of the CDOs I interviewed last year, who said, “You might be Lord Algorithm but if you don’t stop to understand the problem, you will never succeed in making government better.”
When the UK created the Government Digital Service in 2011, the focus was on long-term transformation of government to digital services, and it has made consistent progress every year. But it didn’t all happen overnight. One of the keys was making sure every agency had a point person and that he or she had a roadmap. We need that same level of focus on individual federal agency strategies.
Recommendation: OMB should hold agency heads accountable on an annual basis for progress on achieving data-driven government, should require them to be the leader of their department’s multi-year strategic plan for data, and require the publication of their data strategy.
Successful Implementation Holds Promise for Lasting Impact
With this exciting momentum at the federal level toward becoming more data-driven in decision-making, there is a tremendous opportunity for the federal government to support the development of capacity in state and local government as well.
The federal government has a unique opportunity at this moment to help incubate and advance the field with actions in every agency. Leadership and investment in capacity now will pay dividends long into the future.
The Federal Data Strategy, with an iterative and collaborative approach, should support agencies moving forward. If it’s done right, the strategy will lead to a major transformation of government.
Whether government should use data to improve society is no longer up for debate – the answer is definitively yes. When high-quality government information is accessible, it can be applied to generate insights that help decision-makers develop better policies for the American people. We’ve seen successes in early education, disease prevention, and government transparency efforts, and more could be done with better access to data.
For too long, our government operated in silos to address some of the issues related to data access and use, without planning for a wide range of data users. Too frequently, the data community reinforces its own silos rather than working in collaboration. That is why the Data Coalition is launching GovDATAx Summit in 2019.
Our inaugural GovDATAx Summit will be the beginning of a renewed dialogue about how we improve government’s data policies to truly unleash the power of data for the public good. The data community should work to empower innovators to generate insights that improve our economy and the quality of life for the American public. The conversation at GovDATAx is intended for a broad audience across that community.
During the Summit’s main sessions, experts from the White House, agency leadership, academia, and the private sector will discuss important new bipartisan policies that are being implemented this year, like the Evidence Act, which establishes new data leaders – chief data officers, evaluation officers, and statistical experts – across the federal government. GovDATAx will feature a discussion of important data standards that lead to better data quality, promote opportunities for public-private partnerships, and present exemplars about what works for using data as an asset.
Be a part of shaping the future of government data. Join the Data Coalition and hundreds of leaders from across the public sector, businesses, non-profits, and academia on Wednesday, October 30 to discuss the next steps for developing policies that unleash data for good.
The American public provides an incredible amount of information to the federal government – valued at $140 billion each year. This information is provided through businesses complying with regulations, individuals and firms paying taxes, and individuals applying for programs.
The development of the Executive Branch’s Federal Data Strategy is an effort to better organize and use all of that information to improve decision-making. On July 8, 2019, the Data Coalition joined the White House to co-sponsor a public forum gathering feedback on what actions the federal government will undertake over the next year to begin implementing the data strategy.
Kicking off the event, Dr. Kelvin Droegemeier, Director of the White House’s Office of Science and Technology Policy, stressed a key goal of the strategy is to “liberate” government data for society’s use, noting that nearly 90 percent of government data go unused. During the forum, 52 speakers – including 12 members of the Data Coalition – and more than 100 other experts provided input on how to improve the draft plan.
Here are four take-aways from the public comments provided during the forum:
#1. Leadership is Essential for Realizing Culture Change
New legislation enacted in early 2019 creates several new positions in government agencies, such as the chief data officers, evaluation officers, and statistical experts. Throughout the public forum, speakers stressed the need for these leaders to be empowered to institute changes within federal agencies, and informed about how to most effectively implement best practices for data access, management, and use.
Several speakers specifically stressed the critical role for newly established chief data officers in improving data quality and usefulness across government, in addition to providing improved training and tools to equip the federal workforce to use data. The concept of data literacy was also prominently featured, meaning that the workforce throughout federal agencies should be routinely trained in the responsibilities and techniques for responsibly managing and using data.
#2. Targeted Data Standards Offer Opportunities for Efficiency
The need for improved data standards was discussed by speakers on more than half the panels during the event, with suggestions that the Federal Data Strategy could do more to encourage standards in the areas of financial regulatory reporting, agency spending and grant management, geospatial data, and organizational entities. For example, multiple speakers highlighted the opportunity to include more direction about the adoption of common business entity identifiers, like the globally recognized Legal Entity Identifier (LEI), as a means of improving analytical capabilities while also reducing reporting burdens on regulated entities.
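One reason a standard identifier like the LEI improves data quality is that it is machine-checkable: under ISO 17442, an LEI is a 20-character alphanumeric code whose last two characters are check digits computed with the ISO 7064 MOD 97-10 scheme, so a receiving agency can automatically reject mistyped identifiers before they pollute a dataset. A minimal Python sketch of that validation (the sample base value below is invented for illustration, not a real registered LEI):

```python
def _to_digits(s: str) -> str:
    # Map letters A-Z to 10-35 (ISO 7064 convention); decimal digits pass through.
    return "".join(str(int(c, 36)) for c in s)

def lei_check_digits(base18: str) -> str:
    # Compute the two check digits for an 18-character LEI base:
    # append "00", take the value mod 97, and subtract from 98.
    remainder = int(_to_digits(base18.upper() + "00")) % 97
    return f"{98 - remainder:02d}"

def lei_is_valid(lei: str) -> bool:
    # A well-formed LEI is 20 alphanumeric characters whose numeric
    # expansion is congruent to 1 modulo 97 (ISO 7064 MOD 97-10).
    if len(lei) != 20 or not lei.isalnum():
        return False
    return int(_to_digits(lei.upper())) % 97 == 1

# Hypothetical base value, for illustration only.
base = "529900ABCDEFGHIJ12"
lei = base + lei_check_digits(base)
print(lei, lei_is_valid(lei))
```

Because any single-character typo changes the value modulo 97, the check catches it, which is exactly the kind of low-cost quality control a shared identifier standard buys reporting agencies.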
#3. Partnerships are an Important Element for Success
Many speakers noted their appreciation for the opportunity to provide feedback on the data strategy, and encouraged ongoing collaboration with those outside government through public-private partnerships. As the strategy is implemented over the next year, industry, non-profits, academics, and others in the American public should have opportunities to weigh in and hold agencies accountable for achieving the stated goals in the plan. Partnerships also offer agencies a specific means to coordinate with those outside of government to ensure the implemented policies and practices achieve meaningful improvements in the country’s data infrastructure.
#4. Coordination at OMB and Agencies is Key
Finally, because government data collected by one agency are often relevant for another, coordination is a critical component of success in the Federal Data Strategy. Speakers highlighted that the proposed OMB Data Council could serve as a model for agencies about how to work across interests, laws, and policy domains to achieve lasting change. But coordination is not OMB’s responsibility alone; every agency must coordinate across its own programs and leaders to promote culture change, build a data-literate workforce, and allocate resources to achieve the goals of the strategy.
In the coming months, the action plan will be finalized and publicly released, incorporating the comments from the Coalition-White House public forum along with other written feedback. The Data Coalition looks forward to continuing to partner with the federal government to ensure our national data policies truly make data an asset for the country.
To read the Data Coalition’s comments on the Strategy’s Draft Action Plan, click here.