
Blog

  • September 22, 2022 9:18 AM | Jessica Yabsley (Administrator)

    We are proud to announce the recipients of the Data Coalition Initiative’s 2022 Datum Awards.

    This year’s recipients are honored with a Datum Award for their roles as innovators, leaders, and champions in the use of data for evidence-based decision-making, transparency, and accountability, to better our government and society.

    Nominees are individuals who champion the use of data in evidence-based decision-making, transparency, and accountability. Each demonstrates a commitment to enabling a more efficient, effective, and innovative government, using data to improve outcomes that better the lives of the American people, strengthen the economy, or make the country a better place.

    Datum Awards Sept. 2022

    2022 Recipients

    POLITICAL LEADERSHIP AWARD

    Senator Brian Schatz

    U.S. Senate

    With cybercrime continuing to harm vulnerable Americans, U.S. Senator Brian Schatz authored the Better Cybercrime Metrics Act of 2022 and ushered it to passage. The bipartisan bill will give law enforcement a clearer picture of online crimes by improving data collection and requiring the FBI to integrate cybercrime incidents into its current reporting streams. Through Senator Schatz’s leadership, law enforcement and policymakers will have wider access to the data and tools they need to keep Americans safe.

    FEDERAL CHIEF DATA OFFICER AWARD

    Samuel C. “Chris” Haffer, Ph.D.

    Chief Data Officer, U.S. Equal Employment Opportunity Commission

    Samuel C. “Chris” Haffer, Ph.D. became the U.S. Equal Employment Opportunity Commission’s first Chief Data Officer in 2017 and immediately set to work modernizing the agency’s data practices. Under Dr. Haffer’s leadership in the past year, the EEOC has launched EEOC Explore, an accessible data query and mapping tool; run nationwide surveys of employers and workers during the pandemic; and created the District Demographic Dashboard to help identify and better serve vulnerable workers.

    DATA INNOVATION AWARD

    Mike Willis

    Associate Director, Division of Economic and Risk Analysis, U.S. Securities and Exchange Commission

    Mike Willis is a longtime advocate for evidence-based decision-making, transparency, and accountability in the financial sector. As a leader in the U.S. Securities and Exchange Commission’s Division of Economic and Risk Analysis, Mr. Willis is leading efforts to require machine-readable financial disclosures, and oversaw the addition of downloadable, freely available data sets about securities holdings of large institutional investors to the SEC’s website.

    CUSTOMER EXPERIENCE AWARD

    Barbara Morton

    Deputy Chief Veterans Experience Officer, U.S. Department of Veterans Affairs

    Barbara Morton has overseen drastic changes in how U.S. veterans are served in her time with the U.S. Department of Veterans Affairs. She has built durable customer experience (CX) capabilities and capacities through her work. Thanks to her exemplary achievements, she is now responsible for sharing best practices in CX across federal agencies. Ms. Morton contributes to the Customer Experience Cookbook, which serves as a practitioner’s guide to building and sustaining customer experience capabilities in government.

    EVALUATION OFFICER AWARD

    Dr. Matthew Soldner

    Chief Evaluation Officer, U.S. Department of Education

    As Chief Evaluation Officer of the U.S. Department of Education, Dr. Matthew Soldner applies extensive knowledge of mixed-methods research to education evaluation, conducting rigorous tests to help states and local governments improve student outcomes. Dr. Soldner and his team have contributed important research on data sharing and the impacts of federally funded learning programs, and supported the What Works Clearinghouse, a resource for educators, evaluators, and researchers everywhere.

    STATISTICAL OFFICIAL AWARD

    Shelly Wilkie Martinez

    Senior Statistician, White House Office of Management and Budget

    Before becoming Senior Statistician at the White House Office of Management and Budget, Shelly Wilkie Martinez served as Executive Director of the U.S. Commission on Evidence-Based Policymaking, supported the Federal Advisory Committee on Data for Evidence Building, and much more. With a deep knowledge of the federal data ecosystem, Ms. Martinez has worked tirelessly in her role with OMB to fulfill the vision of the Evidence Commission and support compliance with the Evidence Act.


    Thank you to our sponsors


  • September 12, 2022 9:00 AM | Data Coalition Team (Administrator)

    The American public relies on good information to decide which products to buy at the grocery store, whether to purchase or rent a home, and which car meets individual needs. Investment decisions on Wall Street are no different: reliable and clear information is needed in the marketplace to make good financial decisions. Government is no exception either; reliable, high-quality financial information allows decision makers to develop and execute sound policies that benefit the economy.

    Today, the ability of companies, government, and the American people to access data about regulated firms across the country is limited by an arcane regulatory reporting infrastructure. The Financial Data Transparency Act (FDTA), co-sponsored by Senators Mark Warner (D-VA), Mike Crapo (R-ID), and Chuck Grassley (R-IA), takes on this problem by modernizing the reporting infrastructure for regulated firms to improve accountability. The bill has clear benefits for the American public, investors, government regulators, and private sector firms.

    Why do we need to modernize our financial data?

    The legislation responds to recommendations provided by the U.S. Treasury Department in a 2017 report on reducing regulatory overlap and duplication for banks and credit unions. The report, prepared in response to an Executive Order on principles for regulating the financial system (E.O. 13772), calls for improved data sharing and reductions in reporting burden and duplication.

    The Financial Data Transparency Act addresses longstanding data deficiencies in regulatory reporting. The bill would require seven of the financial regulatory member-agencies of the U.S. Financial Stability Oversight Council to adopt and apply uniform data standards (i.e., a common data format) for the information they collect from regulated entities. These standards will enable better information processing, software-enabled filing preparation, and data reconciliation, collectively giving retail investors, regulators, and the market better information for selecting investment opportunities and understanding risks.

    How will the Financial Data Transparency Act help the government operate more efficiently?

    The Financial Data Transparency Act establishes a framework that can be used to improve regulatory reporting efficiency in coming years, reducing compliance overhead and the level of effort required for submitting financial reports. It also sets the stage for financial regulators to have access to higher quality data so they can spend their time focused on enforcement rather than tracking down inadvertent errors in reports. Streamlining regulatory reporting frees up valuable time and energy that can also support private sector innovation and productivity growth. 

    Why does the legislation require open data standards?

    The FDTA reiterates the requirement, under the Foundations for Evidence-Based Policymaking Act, that disclosable public data assets be made available as open government data assets. This ensures that data assets published under the regulatory authorities of the FDTA’s covered agencies are presented in a manner consistent with existing government-wide data policy.

    Open data standards allow for better information processing, software-enabled filing preparation, and data reconciliation, as described above. The FDTA includes a set of required characteristics that builds upon industry and technology best practices, accounts for lessons learned from existing federal regulatory standard setting, and incorporates relevant federal policy and international standards definitions.

    The data will be made available under an “open license,” which will reduce barriers for industry, academia, and others to incorporate or reuse the data standards and information definitions in their systems and processes. This requirement will also foster competition among multiple vendors of creation, data collection, and analysis tools, which reduces long-term costs.

    Does requiring open data standards mean there will be universal reporting requirements?

    No. The FDTA does not impose new data collection or reporting requirements, and does not require agencies or regulated entities to collect or make publicly available additional information.  

    Will FDTA’s requirements mean that regulated entities need to adopt a specific software or other technology?

    No. Open data standards can actually help facilitate technological innovation by reducing barriers for industry, academia and others to create new tools, such as artificial intelligence and machine learning applications. FDTA does not impose any technological mandates and regulated entities would continue to independently choose their software and technological solutions. 

    Will the FDTA requirements be difficult for regulated entities to adopt?

    The FDTA provides agencies with the flexibility to scale data reporting requirements in order to reduce burdens on smaller regulated entities and minimize disruptive changes for those affected by regulations. The data standards required by the FDTA leverage existing, industry-accepted data formats and definitional standards. The standards connect with existing accounting standards, allowing regulated entities to draw on the expertise and processes established by the accounting, audit, legal, and regulatory compliance workforce without incurring undue new burdens or costs.

    How does the Financial Data Transparency Act relate to the Financial Transparency Act? 

    The Financial Transparency Act (H.R. 2989), cosponsored by Reps. Carolyn Maloney (D-NY) and Patrick McHenry (R-NC), is substantively very similar to the FDTA with common requirements, scopes, goals, and timelines. The bills differ in the process used to implement rules, with the FDTA establishing a joint rulemaking process for improved coordination across federal financial regulators. This coordinated process reduces the burden on regulated entities participating in regulatory processes for establishing standards and enables greater coordination – at the expense of the government – for engaging with stakeholders and responding to concerns that may be raised in future standard-setting activities. 

    The House-proposed FTA passed the U.S. House of Representatives with a strong bipartisan vote of 400-19 on October 25, 2021. It passed the House a second time in 2022 as part of the National Defense Authorization Act.

    Has the Data Coalition taken a position on the Financial Data Transparency Act?

    Representing members from the data analytics, technology, and management fields, the Data Coalition endorsed the current version of the Financial Data Transparency Act in the 117th Congress. The Data Coalition also previously endorsed the related Financial Transparency Act in the 114th, 115th, 116th, and 117th Congresses. These bipartisan pieces of legislation address the significant underlying data challenges that contribute to burdensome and ineffective financial regulation.

    What other organizations endorsed the Financial Data Transparency Act or the related Financial Transparency Act?

    Other transparency organizations and financial firms quickly endorsed versions of the legislation.




    • July 04, 2022 5:36 PM | Anonymous member (Administrator)

      The White House Equitable Data Working Group (EDWG) released A Vision for Equitable Data in April 2022. The report outlines the recommendations of the Working Group established by Executive Order 13985 in January 2021. The Working Group was tasked with identifying inadequacies and areas of improvement within federal data related to equity, and outlining a strategy for increasing data available for measuring equity and representing the diversity of the American people and their experiences. 

      The report is an important step toward ensuring equity is at the forefront of government policies and programs. Three priority uses for equitable data were presented in the report:

      1. Generating disaggregated statistical estimates to characterize the experiences of historically underserved groups,

      2. Increasing access to disaggregated data for evidence building, and

      3. Conducting equity assessments of federal programs.

      The Data Coalition supports these priorities, along with the report findings and recommendations – particularly the emphasis on the collection and disaggregation of demographic data as well as the suggestion that agencies work with federal statistical agencies to incorporate and protect demographic data. In July 2022, the Data Coalition Initiative met with staff from the House Oversight and Government Reform Committee (HOGR) at their request to discuss the EDWG report, gaps in the report, and potential areas for Congressional and Administrative support. 

      There are opportunities to further strengthen an environment that will support a more equitable data system. Data Coalition members used the time with the HOGR Committee to offer insight into certain aspects of improving data quality and designing an equitable and inclusive process, from designing data collection plans to ensuring data are useful for end users across communities. Data Coalition members’ comments fell into four general categories: fostering trust by inclusive engagement, developing and using data standards, increasing accessibility, and bolstering accountability.

      Opportunities to Strengthen the EDWG Report

      Beyond engaging all levels of government and the research community as highlighted in the report, it is crucial to have meaningful stakeholder engagement, especially with the communities affected by data collection. Meaningful engagement fosters trust and prevents data collection and use from causing unanticipated harm to people. This trust is essential to ensuring that the data collected from individuals and communities are high quality and relevant to policies that aim to help those same communities.

      Data standards are also necessary to ensure a more aligned, equitable federal data ecosystem. Developing consensus standards needs to be done in collaboration with communities of practice, data contributors, and potentially impacted communities. Though agencies may have the authority to identify preferred standards, they lack the authority to adopt standards. Adopting consensus standards among agencies can provide more clarity for those collecting and using government data. Similarly, consensus data standards can facilitate data sharing amongst agencies – reducing burden on both government and taxpayers. 

      The report discusses the need to increase accessibility by making data more understandable and useful. To address this, the government can bolster human capital to support data capacity within agencies and invest in users’ ability to work with data, engaging those who contribute it. Additionally, agencies and other stakeholders should demonstrate the value of data in addressing underserved community needs.

      Providing data access tools and developing usable online data portals are ways to tackle accessibility as well as enhance transparency and accountability. The EDWG approaches accountability and transparency from a taxpayer lens, but in terms of equitable data use and impact there should also be a requirement to tell the story of the data’s impact, putting more emphasis on the benefit a community gains from the use of its data and, in turn, incentivizing continued data contribution.

      Legislative & Administrative Recommendations

      With this in mind, the Data Coalition offers the following recommendations for how to further the EDWG’s efforts: 

      • Leverage existing provisions from the Foundations for Evidence-Based Policymaking Act of 2018 to improve collection, management, and use of data 

      • Pass the National Secure Data Service Act (H.R. 3133) 

      • Fund federal, state, and local government to adopt and modernize data systems

      • Develop legislation to adopt consensus-based data standards, including requiring collection of data that informs agency equity assessments using a uniform standard that can be adopted at all levels of government

      • Build additional funding flexibilities into the grantmaking process to enable agencies to direct grants toward building capacity without needing additional funds

      The EDWG report includes critical steps to make data more equitable, and the HOGR Committee’s interest in gathering stakeholder input to bolster the report’s recommendations is encouraging. In addition to Congressional and Administrative opportunities, continuing to engage data and equity experts as well as the communities providing and using the data, and leveraging existing authorities and expertise within the government, will all contribute to a more equitable data system.



    • June 01, 2022 9:00 AM | Data Coalition Team (Administrator)


      President Biden signed the bipartisan Courthouse Ethics and Transparency Act into law (P.L. 117-125) in May 2022. The new law requires that all judicial financial disclosures be made available in a "full-text searchable, sortable, and downloadable format for access by the public." The law aims to make court financial data open and accessible, enhancing transparency and accountability in the Judicial branch of the federal government.

      Federal judges are required to recuse themselves from cases where there may be a financial conflict of interest, including if family members have a financial interest in a case. However, the current oversight process to ensure judges are taking the necessary steps to be impartial is lengthy and complicated.

      Currently, federal judges must publicly report securities transactions – such as the purchase or sale of stocks, bonds, commodities futures, and other forms of securities. However, these reports have only a six-year lifespan before they can be destroyed, and they are difficult for the public to access. Even if an interested party requested and collected judicial disclosure reports, they may not be useful if the data are not searchable.

      The Courthouse Ethics and Transparency Act, first introduced in the 117th Congress in October 2021 by Senator John Cornyn (R-TX), lays out two main provisions to address the current oversight system. First, federal judicial officials must file a report for securities transactions over $1,000 within 45 days, in line with requirements for top officials in the Executive and Legislative branches of government, including the President, Members of Congress, and presidentially appointed, Senate-confirmed officials. Second, the Administrative Office of the U.S. Courts must make these disclosures publicly available in a searchable internet database no more than 90 days after the report is filed.

      The establishment and maintenance of a searchable, sortable, and downloadable database is key to removing unnecessary limitations on access to statutorily mandated financial disclosures, yielding a collection of data that is useful and timely. By making these data more easily accessible and available for oversight, the Courthouse Ethics and Transparency Act can help strengthen public confidence in the integrity of the federal judicial system and improve overall trust in government.

      The Data Coalition Initiative advocates for policies that ensure government data are high-quality, accessible, and usable, and we applaud all those who have been working to advance the Courthouse Ethics and Transparency Act. This is another important step toward all branches of the federal government using data to operate in a transparent and effective manner.


    • March 30, 2022 9:00 AM | Data Coalition Team (Administrator)

      After a nearly two-year pause on in-person convenings, the Data Coalition Initiative hosted the AI Public Forum x RegTech22 Data Summit, sponsored by Donnelley Financial Solutions (DFIN), as a hybrid event with participants both in Washington, D.C. and online. The event aimed to build a strong national data community and advocate for responsible policies to make government data high-quality, accessible, and usable – a central goal of the Data Coalition.

      The day began with a public forum on “Accelerating AI in the Public Sector,” where over 20 representatives from industry, academia, non-profits, state and local governments, and the general public shared perspectives and recommendations for how to best use AI in public sector regulation. The afternoon program, RegTech22 Data Summit, featured discussions focused on the intersection of regulatory technology (RegTech) and improving customer experience (CX). 

      The RegTech22 Data Summit was designed with two goals in mind:

      1. Highlight areas where regulatory technologies improved processes and reduced burdens for data and information collection through regulatory processes; and 
      2. Showcase how innovations in data management and information usability have improved customer experiences. 

      The Summit featured a wide range of experts from agencies and organizations, each representing different “customers” with varying missions and needs. Innovative solutions, from data visualizations and collaborative databases to standardized data and entity identifiers, showed that there are both successes to celebrate and clear avenues where agencies can refine their efforts.

      The Summit program kicked off with a panel featuring Tammy Roust, Chief Data Officer (CDO) of the Commodity Futures Trading Commission (CFTC), who illustrated the breadth of the commission’s customers by noting that we cannot buy a box of cereal without interacting with the work of the CFTC. Ultimately, she noted, “we serve [American taxpayers] by ensuring that the futures and derivatives markets are fair and transparent and have integrity.” Responding to the White House executive order to prioritize CX, the CFTC is building data visualization tools and open data portals under the guidance of the CDO office, making publicly available data more accessible while maintaining the privacy protections specific to the commission’s data feeds.

      Considering different “customers,” co-panelist Adam Scott, Director of Design and Development at the Consumer Financial Protection Bureau (CFPB), worked with the bureau’s Chief Technologist, Erie Meyer, to identify what information might be most valuable for the general public visiting the CFPB website. To respond quickly, Adam and Erie ran design sprints: they examined broad problems, worked with internal experts to understand the problem space, hypothesized solutions, designed prototypes, and tested with real users in a short timeframe. The new approach allows the CFPB to respond rapidly and, as Adam summarized, “mak[e] sure that we’re constantly evolving and updating our site to meet the most critical needs of our users.” Highlighting how data access is also important for effective CX, Adam discussed how his team maintains the CFPB public data inventory: after identifying a distinct pain point, that users had no means to discover all of the CFPB’s public data sets, the CFPB built a simple webpage to fill that gap.

      In the Summit’s second panel on improving climate data, CDO of the Federal Energy Regulatory Commission (FERC) Kirsten Dalboe emphasized the role of agency CDOs as the ones responsible for overseeing proper data governance and use throughout the data lifecycle. “Effective data governance and a managed data lifecycle designates trusted data sources. It maintains a comprehensive data inventory. It defines data standards, determines policies and strategies for promoting enterprise data management activities, promotes data stewardship to facilitate data sharing, collaboration, and data quality,” explained Dalboe. Proper data governance allows agencies and organizations to know what they have and how it can be used to carry out their missions. 

      Being able to correctly identify data needs, uses, and gaps is fundamental to building and sustaining the proper use of data, whether an agency or organization interfaces with the general public, regulators, academics, international organizations, investors, or utilities. Panelist Kristy Howell, Senior Economist at the International Monetary Fund (IMF), discussed the G20 Data Gaps Initiative, an IMF project started after the global financial crisis to identify data gaps and ways the IMF can work with the G20 economies to improve data. According to Howell, the data gaps project is expanding to include reviews of climate impact and macroeconomic data. Additionally, the IMF developed the Climate Change Indicators Dashboard, a truly collaborative database created to address the growing need for more information on climate and to assess how climate impacts the macroeconomy and how the economy impacts the environment. As Howell and SAP Vice President and Head of U.S. Government Relations Kevin Richards stated multiple times – “you can’t manage what you can’t measure.”

      In a theme seen throughout the Summit, Howell continued, “data access and data sharing are important pillars” to collaborative problem solving efforts. Similar to remarks made by Dalboe, a common understanding of data elements and vocabulary – data standards – can help drive innovations and will support data-informed decision making. “Standards facilitate automation and streamlining of processes, which results in lower costs and more effective analysis,” said Mike Willis, Chairperson of the Regulatory Oversight Council (ROC) and Associate Director in the Division of Economic and Risk Analysis at the Securities and Exchange Commission (SEC).

      Highlighting one area of improvement for data standards, Willis noted that federal agencies use over 50 legal entity identifiers, each unique to its agency. According to Willis, standardized identifiers such as the Legal Entity Identifier (LEI) for legal entities, identifiers for derivatives, and the Financial Instrument Global Identifier (FIGI) for financial instruments could connect information across regulatory agencies and international borders, simplifying compliance processes, bolstering Know-Your-Customer (KYC) requirements, and even enhancing Zero Trust Architecture, which relies on digital identities. In the case of Zero Trust Architecture, which federal agencies are required to adopt under a 2021 executive order on cybersecurity, the LEI offers an internationally standardized, unambiguous identifier for the legal entities issuing digital credentials – validating the digital credential and expanding traceability. Through reduced costs and compliance burdens, both regulators and regulated entities would benefit from the use of standardized data and standardized entity identifiers.
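
      One reason a single standardized identifier simplifies automated compliance checks is that the LEI is self-verifying: ISO 17442 defines it as 20 alphanumeric characters whose last two digits are check digits under the ISO 7064 MOD 97-10 scheme. The sketch below is our own minimal illustration of that validation step, not an official implementation; the sample identifier is simply a well-formed LEI of the kind found in public documentation.

      import string

      def is_valid_lei(lei: str) -> bool:
          """Check an LEI's structure and its ISO 7064 MOD 97-10 check digits."""
          lei = lei.strip().upper()
          # ISO 17442 structure: exactly 20 characters, letters and digits only.
          allowed = string.ascii_uppercase + string.digits
          if len(lei) != 20 or any(c not in allowed for c in lei):
              return False
          # Replace each letter with its two-digit value (A=10 ... Z=35); the
          # resulting integer must leave a remainder of 1 when divided by 97.
          numeric = "".join(str(int(c, 36)) for c in lei)
          return int(numeric) % 97 == 1

      print(is_valid_lei("5493001KJTIIGC8Y1R12"))  # well-formed sample -> True
      print(is_valid_lei("5493001KJTIIGC8Y1R13"))  # corrupted check digit -> False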

      Expounding on how government is using technology to improve CX, Hugh Halpern, Director of the Government Publishing Office (GPO), began his keynote address by reminding the audience that GPO is responsible for producing and providing print and publishing services to all three branches of government, manufacturing everything from the Federal Register, the Congressional Record, and congressional bills to U.S. passports and much more. Focusing on congressional documents and the GPO’s design evolution over the past 200 years, Director Halpern explained how the technology morphed from handset type to digital text editing; while the design and accessibility of the documents adapted with available innovations, they were not always user friendly. The introduction of the USLM XML schema laid the digital foundation for exponential improvements to legislative documents. USLM feeds XML Publishing (XPub) and allows legislators and staff to more seamlessly track changes, compare document versions, and use the document for additional content development, “just like a normal text file!” While there are still areas for improvement, solutions like USLM have the potential to smooth out issues with unstructured data and text files that gum up communication lines across the legislative branch. Just like the work of the GPO, the future use cases for such technologies are not limited to one branch of government.
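
      To see why structured markup matters, compare a run of plain text to an addressable element tree: once each provision is an element, software can diff versions, build citation indexes, and re-render content for different audiences. The fragment below is a simplified, USLM-flavored sketch of our own; element names echo the general section/num/heading/content pattern, but GPO’s actual schema is far richer.

      import xml.etree.ElementTree as ET

      # Simplified, illustrative fragment only (namespaces and many required
      # attributes from the real USLM schema are omitted).
      fragment = """
      <section id="sec-2">
        <num value="2">SEC. 2.</num>
        <heading>Definitions.</heading>
        <content>
          <p>In this Act, the term 'covered agency' means ...</p>
        </content>
      </section>
      """

      root = ET.fromstring(fragment)
      # Because the text is structured, a tool can address any provision
      # directly, e.g., to compare versions or assemble a citation index.
      print(root.find("heading").text)       # -> Definitions.
      print(root.find("num").get("value"))   # -> 2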

      The White House executive order on CX – Transforming Federal Customer Experience and Service to Rebuild Trust in Government – prioritizes how agencies engage with the American people, including lines of feedback. Speaking on the panel topic of “Improving Citizen Engagement and Transparency in Rulemaking Using RegTech Solutions,” Reeve Bull, Research Director at the Administrative Conference of the United States (ACUS); Virginia Huth, Deputy Assistant Commissioner, Federal Acquisition Service Office of Technology Transformation, General Services Administration (GSA); Kirsten Gullikson, Director, Systems Analysis and Quality Assurance at the Office of the Clerk, U.S. House of Representatives; and Susan Dudley, former Administrator of the Office of Information and Regulatory Affairs, White House Office of Management and Budget, each emphasized the importance of engaging the general public in an effort to support transparency and expand trust.

      The panelists explained how technology has the potential to help with this, but in the rulemaking process it must be applied in the right context. Dudley and Bull looked at potential areas of improvement in the e-rulemaking process, including agile methods that would incorporate constant feedback loops. While not yet widely used, this approach has the potential to increase public engagement and improve rollouts. To elaborate further, Huth offered a few examples of innovations she managed at GSA’s eRulemaking Program Management Office, referring to a proof of concept that combines machine-readable language, no-code software, and natural language processing. Bringing together this “trifecta,” Huth explained, allowed GSA to build a technological approach that “enables machine-readable tasks that give us the ability to provide context” – something that is particularly useful when helping regulators sift through public comments and review regulations, identifying out-of-date topics and contradictory regulations.

      Changes in government processes, data governance, and the strategic use of technologies will take time. As CFTC CDO Tammy Roust remarked, the fact that government does not match the private sector’s pace of change is not necessarily a bad thing. Fellow panelist Darren Wray, Co-founder of Guardum, a DFIN company, expanded on the point by noting that when something breaks in government transformation, the consequences can be catastrophic. Nonetheless, private-public collaborations and conversations like those hosted during the RegTech22 Data Summit provide a space for the exchange of ideas and insight into how available management approaches and technologies can aid government efforts to interact more effectively with the public.

      The data, information, programs, and regulatory technologies discussed during the RegTech22 Data Summit are but a few examples of how efforts to increase transparency, accountability, and public trust in government are in motion. There is no one approach that will guarantee that public trust in government will improve, but the examples discussed throughout the day shed light on efforts across government and the private sector to improve data availability, usability, readability, and accessibility.  

      The Data Coalition was founded 10 years ago with the mission to improve government transparency through the passage of the Digital Accountability and Transparency (DATA) Act, which became law in 2014. Since then, the Data Coalition Initiative and allies have advocated for data laws that change the way government tracks, manages, uses, and shares data. Reliable, high-quality, and accessible data feed the technologies discussed during the RegTech22 Data Summit. Learning to work with the data produced, ingested, and circulated by the government remains an ongoing challenge, and the Data Coalition Initiative will continue to advocate for responsible policies to make government data high-quality, accessible, and usable.


    • January 21, 2022 9:00 AM | Data Coalition Team (Administrator)

      The Data Coalition’s Environmental, Social, and Governance (ESG) Data Working Group released its final report this week, sharing recommendations to help bring ESG data to a point where it can be used effectively in a regulatory environment. The group calls for ESG data to be standardized, machine-readable, and auditable. If the U.S. Federal government were to implement the working group’s recommendations, it would be positioned to take the lead on developing standards and expectations before international standards are in place.

      ESG data in an entity’s financial reports are used by stakeholders to assess risk and long-term value on metrics affecting governance, people, profits, and the planet, increasingly influencing international investment decisions. Currently, ESG reporting is largely voluntary, and reporting requirements are determined by investment firms and the market, creating an environment of confusing and incomparable ESG disclosure. According to the working group, this means that “understanding the information provided across firms, sectors, regions, or countries is not only challenging, but largely unreliable and uncertain.”

      As calls for ESG reporting grow in the U.S. and internationally, the Data Coalition recognized an opportunity to support useful ESG reporting. The Data Coalition ESG Working Group convened in 2021 and included representatives from member companies DFIN, Workiva, Summit LLC, the Global LEI Foundation, SAP, and XBRL US, along with Data Foundation staff. With the goal of producing guidance to inform data-related legislation, financial regulations, and private sector practices, the group came together throughout the year to develop an understanding of the current landscape of ESG reporting and determine the most critical improvements needed. In June 2021, the group provided public feedback on a climate disclosure inquiry from the Securities and Exchange Commission, and it released its final report in December 2021. The report makes three core recommendations:

      • First, the U.S. should contribute to consensus international domain standards to the extent feasible. Existing independent standard-setting organizations that take into account national, regional, sector, and market perspectives provide a platform for producing standards that are usable and consistent with global activities.
      • Second, all ESG data disclosures should be machine-readable and digital so that they facilitate interoperability. Machine-readable data allows more efficient and cost-effective growth across the industry.
      • Lastly, ESG data should be auditable to support greater accountability and transparency. A glossary of terms and data sources, along with processes similar to the disclosure controls used in financial reporting, can increase the confidence of investors and the public that ESG data are trustworthy and of high quality.

      Market pressure for ESG disclosure is increasing, and the U.S. should take advantage of this momentum. The Data Coalition supports the working group’s conclusion that U.S. regulators, by cultivating an ecosystem of standardized, machine-readable, and auditable data, can facilitate more trustworthy and influential ESG reporting.


      Download the full recommendations 


    • December 02, 2021 9:00 AM | Data Coalition Team (Administrator)

      Katharine G. Abraham and Ron Haskins, former Co-Chairs of the U.S. Commission on Evidence-Based Policymaking, wrote to the U.S. Senate Committee on Commerce, Science, and Transportation and the U.S. House Committee on Science, Space, and Technology to strongly encourage the respective committees to include the NSDS Act in the conferenced version of the United States Innovation and Competition Act (USICA).

      We are the former Co-Chairs of the U.S. Commission on Evidence-Based Policymaking, appointed respectively by then-President Barack Obama and then-House Speaker Paul Ryan. The Commission members included fifteen politically-appointed data and privacy experts. Our report, released in September 2017, contained a set of findings and recommendations endorsed by all of the Commission’s members for more effectively leveraging existing government data assets while also enhancing privacy protections. [1]

      In 2018, Congress acted to implement about half of the Commission’s recommendations by passing the Foundations for Evidence-Based Policymaking Act (P.L. 115-435). The experience since that time has made clear the benefits for government transparency and accountability of implementing the Evidence Act’s provisions. Thanks to newly appointed Chief Data Officers and the increased emphasis on program evaluation as a core function of government, among other changes, the government has made significant progress towards more effectively using the data it holds.

      While the Evidence Act put many of our recommendations into law, other important recommendations remain for the Congress to address. One that we believe is a critical priority is the establishment of a National Secure Data Service (NSDS). Earlier this year, the House of Representatives approved the bipartisan NSDS Act as part of the National Science Foundation reauthorization. As the former leaders of the Evidence Commission, we endorse the NSDS Act as a means to address one of our most important recommendations, establishing a secure data linkage resource that can facilitate addressing broad policy and research questions while protecting privacy and confidentiality. We strongly encourage Congress to include the NSDS Act in the conferenced version of the United States Innovation and Competition Act (USICA).

      The NSDS Act was based on the Evidence Commission’s 2017 report, research from the National Academy of Sciences, and subsequent research carried out by a Commission member and senior staff member [2]. Many of the original Commissioners also provided direct feedback on and agreed with that subsequent proposal. In May 2021, NSF’s Assistant Director for Social, Behavioral, and Economic Sciences said in a public statement that NSF is already beginning efforts to establish the infrastructure to operate the data service [3]. More recently, the Federal Advisory Committee on Data for Evidence Building established by the Evidence Act released its interim recommendations calling for the creation of a data service and presented an implementation framework that aligns with the NSDS Act.

      The establishment of an NSDS would substantially contribute to strengthening the analytical capabilities of the Federal Statistical System, provide a resource and expanded capacity for conducting program evaluation, and, at a practical level, create an infrastructure for more rapid responses to congressional policy inquiries about outcomes and programmatic impacts. This can be achieved within the strong privacy framework laid out in the Confidential Information Protection and Statistical Efficiency Act (CIPSEA) reauthorized by the Congress in 2018, consistent with what is proposed in the NSDS Act. 

      In our view, the National Secure Data Service is a necessary part of modernizing our country’s data infrastructure. We encourage the Congress to include this bipartisan proposal in the conference agreement for the House NSF reauthorization bill and the Senate USICA. We are happy to speak with you or your staff on any questions about the Evidence Commission’s recommendations or the NSDS Act. 


      Download the Letter Here


      1. U.S. Commission on Evidence-Based Policymaking. (2017). The Promise of Evidence-Based Policymaking: Report of the Commission on Evidence-Based Policymaking. Washington, D.C.: GPO. Available at: https://www.datafoundation.org/s/Report-Commission-on-Evidence-Based-Policymaking.pdf
      2. Hart, N. and N. Potok. (2020). Modernizing U.S. Data Infrastructure: Implementing a National Secure Data Service to Improve Statistics and Evidence Building. Washington, D.C.: Data Foundation. Available at: https://www.datafoundation.org/modernizing-us-data-infrastructure-2020.
      3. Lupia, A. (2021). Letter to the Data Foundation RE National Secure Data Service. National Science Foundation. Available at: http://www.datacoalition.org/wp-content/uploads/2021/11/NSF-Response-to-Data-Foundation-Inquiry-Lupia-May-18-2021-FINAL-1.pdf.

    • November 16, 2021 9:00 AM | Data Coalition Team (Administrator)

      Earlier this year, the Chief Data Officers Council requested public feedback on how they can continue to improve the government’s management, use, protection, dissemination, and generation of data in decision-making and operations.

      In response, the Data Coalition hosted a virtual public forum to create an opportunity for the data community to offer feedback, recommendations, and advice to the federal CDO Council. As a result of that forum, as well as research informed by the Data Foundation's CDO Insights Survey, we offered the following 12 recommendations to the CDO Council.  Our full comments are available here.

      • Recommendation 1: CDOs should work with their agency CFO and OMB to increase CDO funding flexibilities and direct resources. Most CDOs do not have adequate resources to fulfill their statutory responsibilities and support agency missions. CDOs need sustained, predictable, and adequate resources to implement data priorities. Congress should authorize CDOs to use additional funding flexibilities and set-aside authorities, as well as provide increased direct appropriations, for CDOs to succeed. This longer-term resourcing plan aligns with the congressional intent in establishing the CDO role through the Evidence Act, which created the position indefinitely rather than for a short-term period.
      • Recommendation 2: CDOs should work with OMB to clarify responsibilities and expectations. While CDOs are operating within their peer community of practice and under the general framework of the Evidence Act and the Federal Data Strategy, additional guidance from OMB can help align emerging administration priorities with the activities implemented by CDOs. In addition, CDOs will benefit from clearer expectations on reporting requirements, including how to address required due dates and what should be reported to OMB, Congress, and the American people. Additional guidance could also include more tactical direction about what steps CDOs should take, how to prioritize those steps, and areas for interagency cooperation and collaboration.
      • Recommendation 3: Congress should remove the statutory sunset of the CDO Council. Currently the CDO Council is scheduled by law to sunset in 2025, yet it has proven to be a valuable coordinating body and community of practice for CDOs, providing vital technical assistance and a venue to convene and share knowledge. Since most CDOs and their offices are relatively new to the role and its responsibilities, additional support from the CDO Council and peers, in the form of technical assistance, resources for strategic planning, and other planning processes, can support the entire CDO community, including CDOs operating with limited staff and capacity. With the expectation that CDO roles continue indefinitely, the coordination of the CDO Council should as well. The CDO Council or OMB should include this request in an appropriate forum, such as the package of FY 2023 President’s Budget legislative proposals.

      • Recommendation 4: The CDO Council should work to create an ecosystem of data-literate and data-fluent workers. The need for more staff capacity was the top request articulated by CDOs in the Data Foundation’s CDO Insights Survey. CDOs did not simply request FTEs; rather, they asked for the highly skilled data scientists, data architects, and data engineers required to successfully carry out data governance and management activities. One cross-agency effort viewed positively by participants was the January 2021 joint hiring initiative coordinated by the Office of Personnel Management, in which ten agencies joined together to hire 50 senior data scientists. In addition to getting high-level expertise into CDO offices, it is important to build base levels of data literacy throughout the workforce in order to support a culture of data. CDOs should create a shared framework for the data skills needed to support their agencies, as well as definitions for various roles throughout their agencies and the types of skills each requires.
      • Recommendation 5: CDOs should emphasize their role as designated leaders to promote training and data fluency among department staff. Commitment from agency leadership to establish a strong data culture is critical for a coordinated training and retention strategy. This should include identifying current gaps in skills, capitalizing on existing training programs and models, and developing new training programs when necessary. The specific limitations of the privacy frameworks in which agencies operate should also be addressed as part of training.
      • Recommendation 6: The CDO Council should work with OMB to ensure that forthcoming implementation guidance to agencies on data inventories prioritizes machine readability and interoperability. Implementing and updating the metadata necessary for data inventories across federal agencies can be an intense process, representing a significant workload. In order to deploy automation technologies that reduce that workload, and to improve the quality of data inventories and of aggregating services like data.gov, the metadata standards associated with these inventories should be machine readable and interoperable. Machine-readable, interoperable metadata supports easier discovery and use of data, especially as the number of data sets within inventories continues to grow, and allows the automation of several processes that can reduce burden on the custodians of data inventories (a minimal illustration of such metadata follows this list).
      • Recommendation 7: CDOs should focus on data sharing standards that facilitate interoperability, data linkage, and privacy. Standardization and the creation of data standards are emphasized in the recently released Advisory Committee on Data for Evidence Building Year 1 report, which reviews the state of data for evidence building in the federal government, particularly opportunities for secure data sharing. There are a number of examples of entities that securely aggregate, integrate, and share information, and we strongly urge the CDO Council to align its work with the efforts undertaken by the Advisory Committee. Additionally, data sharing approaches must prioritize robust confidentiality and privacy safeguards. Tiered access and pilot projects testing privacy-enhancing data linkage and multiparty computation have offered limited preliminary evidence of the promise of such approaches; however, further investment in privacy-enhancing technologies is needed in government before operating at scale.
      • Recommendation 8: Publicly accessible data must be prioritized. Ensuring that government data are easily accessible and usable aligns with efforts to promote transparency and accountability in government. Accessibility also facilitates collaboration with researchers, the private sector, and other levels of government, which can lead to more efficiency and innovation in public service.
      • Recommendation 9: CDOs can improve communication about how they demonstrate the value of using data. Where possible, in coordination with the Evaluation Officer and the Statistical Official appointed under the Evidence Act, CDOs should take deliberate steps to provide metrics, summaries, and, when possible, evaluations that highlight the impact and cost savings of their efforts. To gain support within their organizations, CDOs need to communicate strategically with their leadership, including by highlighting valuable accomplishments that improve the ability of other staff to perform their roles. In so doing, CDOs can build a more compelling case for the resources needed to create and grow their staff and gain leadership buy-in within their agency. If a CDO can show programmatic savings in time and/or dollars resulting from their activities, they establish a basis for justifying the use of existing resources and requesting greater resources in the future. Even small wins are vital to building support. CDOs also benefit by helping to manage organizational change, encouraging data literacy, and increasing the influence of evidence-informed decision making.
      • Recommendation 10: CDOs should conduct regular maturity assessments to accurately gauge existing data capacity and needs. Maturity assessments like the one required by the Evidence Act should be a continuous process rather than simply for compliance. Understanding the day-to-day operational needs for data and data skills will allow CDOs to effectively direct resources and training to areas within their agencies that may be in most need of support. By measuring levels of data literacy, use of data, and other aspects at the project level, CDOs can ensure that they are facilitating the growth of a strong data culture within their agency.
      • Recommendation 11: The CDO Council should create a permanent data ethics working group to ensure the Data Ethics Framework continuously meets emerging needs, to provide resources and guidance to agencies, and to partner with relevant professional associations for ongoing education and training on data ethics. There is a need for clear, unified guidance from the CDO Council in regard to ethics and equity standards for data. Existing frameworks, such as the Federal Data Strategy’s ethics framework, provide guidance for developing a single standard going forward, but the CDO Council should collaborate with ethics-focused organizations outside of government to encourage the application of best practices and continuous improvement to those practices.
      • Recommendation 12: The CDO Council should work with CIOs to facilitate the adoption of appropriate modernized technology. Data collection, management, and analysis present unique challenges and needs for technology, such as automation to optimize data collection and tools that can streamline data collection, analysis, and storage. When adopting new technology to help support data functions, we encourage the CDO Council to partner with CIOs and other relevant stakeholders to leverage existing technologies, where possible, in order to avoid “reinventing the wheel.”
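
      As a concrete, purely illustrative sketch of what machine-readable, interoperable inventory metadata looks like, federal agencies publish data inventories as data.json files following the DCAT-based Project Open Data metadata schema, which data.gov harvests. The entry below follows that general shape; the field values and URLs are hypothetical.

      import json

      # Minimal dataset entry shaped like a data.json record under the
      # Project Open Data metadata schema; the values here are hypothetical.
      dataset_entry = {
          "title": "Example Agency Grant Awards",
          "description": "Hypothetical grant award records, FY 2021.",
          "identifier": "https://example.gov/data/grants-fy2021",
          "accessLevel": "public",
          "modified": "2021-11-16",
          "publisher": {"@type": "org:Organization", "name": "Example Agency"},
          "contactPoint": {
              "@type": "vcard:Contact",
              "fn": "Data Inventory Team",
              "hasEmail": "mailto:data@example.gov",
          },
          "keyword": ["grants", "awards"],
          "distribution": [{
              "@type": "dcat:Distribution",
              "mediaType": "text/csv",
              "downloadURL": "https://example.gov/data/grants-fy2021.csv",
          }],
      }

      # Because every agency publishes the same machine-readable shape,
      # aggregators can discover, validate, and index entries automatically.
      print(json.dumps(dataset_entry, indent=2))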

      The success of CDOs in the federal government hinges on their ability to perform expected and critical tasks. If they are successful, government data can be an asset, creating a robust data infrastructure that will serve a variety of purposes, including improving operational decision-making and evidence-based policymaking capabilities. While there are challenges, the progress of CDOs over the past year is commendable. We hope to continue a productive working relationship and dialogue with the Council going forward and are happy to respond to any questions you may have regarding these recommendations.


    • November 05, 2021 9:00 AM | Data Coalition Team (Administrator)

      Author: Amanda Hejna, Data Foundation Fellow and Senior Associate with Grant Thornton Public Sector

      The Advisory Committee on Data for Evidence Building (ACDEB) was formed over a year ago to provide recommendations to the White House Office of Management and Budget (OMB) on how agencies can better use data to build evidence and improve decision making across the federal government and beyond. Composed of data experts from all levels of government and the private sector, the Committee was charged with forming a foundational understanding of the current state of, and future needs for, the use of data for evidence building, and in doing so fulfilling the spirit and vision of the Evidence Act.

      Throughout the first year, the Committee focused particularly on developing a vision and framework for a National Secure Data Service that would connect data users at all levels of government and the public and establish a unified evidence-building system across the federal government. At the culmination of Year 1, the Committee presented seven high-priority recommendations to the Director of OMB. These actionable and timely items will contribute directly to ongoing implementation of the Evidence Act and the establishment of a successful National Secure Data Service:

      • Evidence Act Regulations: Provide additional guidance and regulations under the Evidence Act related to the operations and responsibilities of statistical agencies and implementation of the OPEN Government Data Act. 
      • Chief Statistician of the United States: Designate a full-time Chief Statistician of the United States within OMB.
      • Standard-Setting Procedures: Establish clear procedures for stakeholder engagement on future data standards for data sets government-wide. The importance of data standardization is a multi-faceted topic that includes considerations such as data quality, data definitions, legal frameworks, and reporting requirements, among others.
      • Appropriations Requests: Increase funding requests to support implementation of the Evidence Act and the Federal Data Strategy in the President’s Budget request to Congress in fiscal year 2023.
      • Value-Driven Pilot Program: Establish a pilot program including projects from federal agencies, states, and localities, to demonstrate the value of increased coordination and data sharing across government.
      • Privacy-Preserving Technologies Case Studies: Publish case studies where federal, state, and local governments used privacy-preserving technologies to encourage future, widespread use of these methodologies. This recommendation falls under the purview of the U.S. Chief Statistician in collaboration with the Interagency Council on Statistical Policy.
      • Communication: Develop a communication and education strategy to facilitate the success of a National Secure Data Service. This strategy should be developed by the U.S. Chief Statistician and should consider a wide range of stakeholders including the public, data providers, researchers, and policymakers at all levels of government.

      A number of subcommittees drilled down into specific focus areas and presented additional recommendations to the broader ACDEB. Focus areas included Legislation and Regulations; Governance, Transparency, and Accountability; Technical Infrastructure; Government Data for Evidence Building; and Other Services and Capacity-Building Opportunities. These preliminary recommendations will be integrated into the Committee’s Year 2 agenda as it looks to define the steps needed to fully operationalize the National Secure Data Service. In the next year, the Committee will continue to expand on its success to advance the use of data for evidence building and ultimately produce better results for the American people.


    • October 26, 2021 9:00 AM | Data Coalition Team (Administrator)

      Last week the White House Office of Management and Budget (OMB) released the Federal Data Strategy (FDS) 2021 Action Plan, an interagency effort meant to coordinate and leverage data as a strategic asset across the Federal government. Building upon the FDS 2020 and stakeholder engagement, the newly released strategy places emphasis on workforce development and data leadership within agencies.

      Part of the Executive Branch’s management agenda, the FDS is a 10-year plan to establish best practices for ethical data governance, management, and use. The FDS is an iterative process, with each Action Plan intended to incorporate lessons learned by agencies in the prior year, public comments, and takeaways from conversations with data professionals from both government and non-government stakeholders – such as the forum hosted last year by the Data Coalition.

      OMB identified major successes from Year 1 in the formation of agencies’ planning, governance, and data infrastructure foundations. Examples include the establishment of the interagency Federal Chief Data Officer (CDO) Council, the creation of a data upskilling pilot, and improvements to data inventories within Data.gov.

      Learning from Year 1’s successes and identified challenges – such as the need for more statutory requirements, published guidance on timelines, and additional interagency working groups – the 2021 FDS lays out 11 categories of actions, drawn from the strategy’s 40 practices, for agencies to implement going forward. Year 2 seeks to offer agencies more flexibility in achieving the Action Plan milestones, in hopes of meeting agencies where they are in the foundational activities begun under the FDS 2020.

      Five of the 11 actions require specific interagency councils to identify pilot projects or government-wide services, highlighting the necessity of collaboration among data leadership. Some Year 2 practices include making non-classified AI use case inventories public, improving the linkage and governance of wildfire fuel data, and creating a data-skills training playbook. The 2021 FDS also reiterates goals from 2020, such as continued assessment of data to answer agency questions as well as maturation of data governance and infrastructure.

      Although Data Coalition members appreciate the Year 2 strategy’s focus on workforce development and the role of data leadership within agencies, many barriers remain to implementing improved data practices across the Federal government. On November 9, the Data Coalition will host a public forum to discuss key takeaways from the Action Plan, gather feedback for the Federal CDO Council’s recent Request for Information, and collect additional input on how best to assist in a collaborative effort to realize the full benefits of evidence-informed policy in practice.



