Blog

  • September 18, 2019 9:00 AM | Data Coalition Team (Administrator)

    One of the largest data collection exercises undertaken by the United States government every 10 years is about to begin – the decennial census. In June 2019 the Data Coalition announced its formal partnership with the U.S. Census Bureau to help promote an accurate count, with support from the business community.

    Data Coalition members met with U.S. Census Bureau Deputy Director Ron Jarmin to discuss the inflection points as the country prepares for the 2020 census, in addition to how the business community’s technology and data analytics firms can be more involved. Here are three key takeaways from the discussion:

    #1: The benefits of a high-quality, accurate count are both clear and significant. 

    The country needs a high-quality census count to support a range of activities beyond allocating representation in Congress, from evidence-based policymaking in government to data-driven decision-making in the private sector. Decision-makers throughout government rely on census data to allocate grant funds, to determine the number of households in a region when assessing program needs, and to benchmark other surveys, such as the American Community Survey. Similarly, the business community relies on census data to determine where to locate new offices or retail establishments and to inform market insights.

    An accurate count is also a difficult feat because some populations are especially hard to count reliably, including migrant communities, children under five, and homeless individuals, to name a few. Capturing all of these populations matters because as population dynamics shift, government and businesses must be able to respond accordingly to ensure public and private services are efficient, effective, and meet the expectations of the American people.

    #2: Census privacy and confidentiality protections are strong.

    While there has been much discourse over the past year about how certain data may be used in today’s environment to support decision-making, Census Bureau data must be held in confidence.

    The Census Bureau is only allowed to release summary statistics or information that does not include personal identifiers to the public. Any violation of this policy is also a violation of two separate laws – the Census Act and the Confidential Information Protection and Statistical Efficiency Act of 2018 – potentially carrying $250,000 in fines and up to 5 years imprisonment under each of the laws for each offense. Needless to say, the safeguards in law and throughout the Census Bureau’s professional staff are taken seriously and the American public should be assured that their confidential data will be strictly protected. 

    #3: Every organization can – and should – play a role in supporting an accurate count.

    There are numerous tactics that businesses and non-profit organizations can use to support an accurate census count. Whether large or small, all organizations can play a role. Potential activities range from promoting the census on websites and encouraging employees to respond in 2020 through emails or breakroom posters, all the way to targeted support services that meet particular needs.

    Data Coalition member Esri published a resource in July 2019 explaining relevant methodological and technology tools for supporting the geospatial capabilities needed for the census. Another Data Coalition member, Tableau, is supporting the Census Bureau’s efforts to track response rates once the census begins, so that local community organizers can have efficient metrics to support their efforts. Deloitte Consulting offers a variety of IT and management support roles to support efficient execution of the 2020 census. New member USAFacts is working to promote the new features of next year’s census. The Census Bureau continues to search for partners in the business community for the 2020 census.

    Other data and technology partners are also critical, as social media and internet capabilities have advanced rapidly over the past decade. The 2020 census will allow responses through the internet, making responding easier than ever, but the risk of misinformation campaigns and the presence of cybersecurity threats are real. Technology and data companies can help the Census Bureau reduce the risks of executing a census in the modern world.

    The Data Coalition will continue to support the effort to ensure the 2020 census results in useful data for the American people. 


  • September 06, 2019 9:00 AM | Data Coalition Team (Administrator)

    Over the past two years, the prospect of the United States government and key decision-makers becoming more steeped in evidence-based policymaking has become increasingly bright. 

    On September 7, 2017, the U.S. Commission on Evidence-Based Policymaking (Evidence Commission) released its final report to the President and Congress with a strategy for better using the data that government already collects. The report contained 22 unanimous recommendations that focused on responsibly improving access to government data, strengthening privacy protections, and expanding the capacity to generate and use evidence.

    Progress on Fulfilling the Commission’s Vision

    While action has not yet been taken on all of the Evidence Commission’s recommendations, significant progress has occurred over the past two years. Here are some key highlights:

    • Evidence Act. The Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act), enacted in January 2019, addresses half of the commission’s suggestions. It includes new directives around establishing federal government leadership roles for data management, program evaluation, and statistical expertise. The Evidence Act establishes new expectations for open data, data inventories, and improved data management. It also reinforces one of the strongest privacy and confidentiality laws in the country: the Confidential Information Protection and Statistical Efficiency Act, which ensures that when government pledges confidentiality to the American public all steps are taken to appropriately protect data. 
    • Federal Data Strategy. In parallel to the Evidence Act, the White House’s management agenda includes the development of a Federal Data Strategy to recognize data as a strategic asset. The agenda prominently incorporates many of the concepts and approaches developed by the commission in the 10-year plan for improving government data access, management, and use. While the formal action plan directing agencies is not yet final, it is expected to reinforce many of the recommendations from the Evidence Commission included in the Evidence Act. 
    • Guidance to Agencies. In summer 2019, the White House’s Office of Management and Budget (OMB) issued multiple guidance documents to agencies about addressing certain Evidence Commission recommendations. These included designating evaluation officers, appointing chief data officers, identifying statistical experts, developing “learning agendas,” and incorporating new actions into annual budget and performance plans. 
    • Individual Agency Actions. In many cases, agencies started making progress in implementing the Evidence Commission recommendations even before guidance was issued. For example, the Small Business Administration developed and released its enterprise learning agenda in early 2018 and the Department of Health and Human Services developed an agency-specific data strategy released earlier in 2019.

    Next Steps on Fulfilling the Commission’s Vision

    The Evidence Commission set the stage for monumental changes in government data laws, processes, and culture. Agencies have initiated wholesale overhauls of data management practices and the recommendations are quickly becoming reality.

    But much work remains to fulfill the bipartisan vision outlined by the Evidence Commission – that government data are meaningfully analyzed to produce credible evidence that is actually used to inform policymaking. In the coming months and years, here are five areas for further attention:

    1. Development and Authorization of the National Secure Data Service. The commission’s headline recommendation was alluded to in the Evidence Act with the establishment of an advisory committee to develop a detailed roadmap, but implementing the National Secure Data Service will require Congress and the President to provide appropriate legal authority to ensure the envisioned data linkage capabilities adequately meet the American public’s privacy expectations.
    2. Improvement to Access for Income and Earnings Data. The commission prominently highlighted that income data are used as a valuable outcome metric for evaluating a wide range of federal programs and policies. Yet, these data are among the most difficult to access, even when privacy guarantees are in place. Additional efforts are needed to enable researcher access to the National Directory of New Hires as well as other targeted improvements to this information that serves as a valuable proxy measure for improvements to quality of life. 
    3. Reforms to Existing Government Processes. The commission highlighted existing limitations of government’s data collection requirements and some issues for allocating funding to meet specific data and evidence needs. Clear enhancements to the Paperwork Reduction Act are still needed.
    4. Exploration of New Technologies. While some new research on emerging technologies for safely conducting data sharing activities occurred in the past two years, there are critical gaps that remain. Government agencies and the private sector must deliberately invest in needed research to enable effective data management and use. 
    5. Sustained Attention for Effective Evidence Act Implementation. The broad requirements for federal agencies embodied in the Evidence Act will only be successful with sustained attention from senior agency leaders and with adequate funding. Congress must ensure agencies receive appropriate resources in the next fiscal year to capitalize on the momentum for improving data quality, increasing the availability of open data, and developing useful analytics and evaluation for policymakers. 

    Today, the Evidence Commission’s legacy can be celebrated as a substantial accomplishment in developing a sound and actionable strategy for better using government data. While more attention is needed to change government’s culture and operations to be more evidence-based, the early steps to better manage and use data are exceedingly promising.


  • September 04, 2019 9:00 AM | Data Coalition Team (Administrator)

    Guest blog by Jane Wiseman, Institute for Excellence in Government.

    As a fellow at the Harvard Kennedy School helping to support a national network of state chief data officers, I’ve had a front row seat to leading edge analytics for the past three years.  As a more recent observer of chief data officers in the federal government, I’ve been impressed by the excellence and diversity of innovative data work being done by the pioneering data officers in federal service.  

    While reading the Federal Data Strategy Action Plan, I was inspired by how detailed and thoughtful it is.  Actions address a range of important topics, and it’s wonderful to see data governance and data literacy getting attention, as these topics are not glamorous and sometimes get too little focus at the state and local level.  There’s an aggressive timeline in the action plan and a remarkable amount has already been accomplished.

    I was honored to share my thoughts at the Federal Data Forum hosted by the Data Coalition and the White House in early July, and was energized by the range of experts who shared their ideas. The simple fact that OMB is creating a data strategy for the federal government is one of the most exciting developments in data-driven government in my career and offers a tremendous opportunity to deliver more value for taxpayers.  

    Listening to other speakers during the forum, I was surprised twice – first by the common concern about the gap between the data literacy government managers need and their current skill levels, and second by how few voices were calling for agencies to create and publish their own data strategies.  

    Observation #1: The federal government needs to invest in data literacy.  

    Most leaders and decision-makers in government are not digital natives, and many lack confidence in their data and digital skills. This gap will slow the adoption of data-driven government.  

    • With 2.5 quintillion bytes of data created every day, we are swimming in data. Unfortunately, few senior leaders are fully comfortable with their own data knowledge and skills, or their organization’s capacity for data use – 4 out of 5 government leaders say decision-making is only somewhat or rarely data-driven, according to a PwC study.
    • There are 90,000 units of local government in the US. Only a small percentage have a designated data leader such as a chief data officer (CDO), and only 20 have chief data officers who are part of the Civic Analytics Network (CAN), a peer network of leading government data leaders.  
    • Data-driven government is about decision-making. But if decision-makers are not data-literate, an agency can have all the data scientists in the world and their results still won’t inform decisions – it will be as if the results are in Greek and no one translated.

    Executives need to know how to ask for data analysis (what’s possible and what’s not), and how to critique results. They don’t need to be able to code or build an algorithm or map themselves, but they should know the power and capability of modern data tools. Basic data literacy means knowing how to ask good questions that inspire analysis, and then having the confidence to interpret and use the results. Consider a comparative study of adult workforce skills across 23 countries, including the US, on "literacy, numeracy, and problem-solving skills": Japan and Finland led the pack on numeracy skills, while the United States ranked a disappointing 21 out of 23 participating countries. Closing this achievement gap and moving toward large-scale organizational change to data-driven government will take a variety of training and coaching offerings. And it will take time.  

    Recommendation: The federal government should provide a full suite of data literacy training, support and coaching for senior leaders and management executives in government so that they have the confidence to lead data efforts.

    Observation #2: Each agency should have its own data strategy and plan. 

    As the saying goes, “If we don’t know where we’re going we might end up somewhere else.”  

    In the Federal Data Strategy Action Plan, actions 12-16 call for parts of what would be agency-specific data strategies. But this call to action falls short of asking each agency head to publish a multi-year data strategy and to report annually on progress toward achieving that strategy. We need to know where each agency is going in order to achieve data-driven decision making at every level of the organization and across all bureaus and agencies. Without a long-term strategy, we can’t hope for permanent culture change.  

    The elements that exist in the action items — from data governance to open data to important data questions — need to be knit together into a multi-year plan for optimizing the use of data. Targets must also be set for each element and reported on annually in a cohesive, integrated, department-wide plan.   

    With a clear charter from the chief executive, and armed with a strategy, a chief data officer can define a roadmap describing the difference the chief data officer team can make in government over a three- to five-year time horizon. The best chief data officers at federal, state and local levels operate from a strategy, a guiding document that sets mission and vision. They share their strategies, which helps them stay focused and communicates to others what is and is not expected (e.g., chief data officers are not the people to call to fix the printer or resolve an email server issue).  

    Agency heads must be the individuals ultimately responsible for their data strategy and must invest their chief data officers with the authority and resources to carry it out. It needs to be clear to all that the chief executive has a strong relationship with and relies on data, and views the chief data officer as a trusted and important resource to the organization.  

    Strategy is about mission and outlining key questions. Asking good questions matters and clearly defining how the analytics research question will generate public value is important. The importance of understanding the connection to priority policy questions is summed up well by one of the CDOs I interviewed last year who said, “You might be Lord Algorithm but if you don’t stop to understand the problem, you will never succeed in making government better.”   

    When the UK created the Government Digital Service in 2011, its focus was on long-term transformation of government to digital services, and it has made consistent progress every year. But it didn’t all happen overnight. One of the keys was making sure every agency had a point person and that he or she had a roadmap. We need that same level of focus on individual federal agency-level strategies.  

    Recommendation: OMB should hold agency heads accountable on an annual basis for progress on achieving data-driven government, should require them to be the leader of their department’s multi-year strategic plan for data, and require the publication of their data strategy.  

    Successful Implementation Holds Promise for Lasting Impact 

    With this exciting momentum at the federal level toward becoming more data-driven in decision-making, there is a tremendous opportunity for the federal government to support the development of capacity in state and local government as well.  For example:  

    • The tools created for federal agencies will benefit state and local governments, where there is far less capacity to build knowledge capital and to document tools and methods.  A great example is the data ethics framework and data protection toolkit called for in actions 3 and 4 in the federal data strategy.
    • The federal government should create Data Analytics Hubs or Centers of Excellence in a service model, ensuring state and local governments do not have to replicate analytics capabilities. Rather, major economies of scale can be achieved if the federal government establishes analytics services hubs to provide data analysis for local governments.
    • Procurement expertise and advice could also be part of these Centers of Excellence. With the rapid growth of the analytics and data mining vendor field, it would be impossible for any local official to have time to stay abreast of trends and to know the right questions to ask to assure that procurements are able to properly protect data privacy and security and to properly take into account data ethics.  A central resource at the federal level could support better investments at the state and local level and reduce the risk of data breaches and improper or wasteful spending of state and local funds for data analytics projects. 

    The federal government has a unique opportunity at this moment to help incubate and advance the field with actions in every agency. Leadership and investment in capacity now will pay dividends long into the future.  

    The Federal Data Strategy, with an iterative and collaborative approach, should support agencies moving forward. If it’s done right, the strategy will lead to a major transformation of government.  


  • August 06, 2019 9:00 AM | Data Coalition Team (Administrator)

    Whether government should use data to improve society is no longer up for debate – the answer is definitively yes. When high-quality government information is accessible, it can be applied to generate insights that help decision-makers develop better policies for the American people. We’ve seen successes in early education, disease prevention, and government transparency efforts, and more could be done with better access to data.

    For too long, our government operated in silos to address some of the issues related to data access and use, without planning for a wide range of data users. Too frequently, the data community reinforces its own silos rather than working in collaboration. That is why the Data Coalition is launching the GovDATAx Summit in 2019.

    Our inaugural GovDATAx Summit will be the beginning of a renewed dialogue about how we improve government’s data policies to truly unleash the power of data for the public good. The data community should work to empower innovators to generate insights that improve our economy and the quality of life for the American public. The conversation at GovDATAx is intended for an audience that:

    • Recognizes data as a strategic asset;
    • Prioritizes government’s need to establish an environment for accessing information; and
    • Values responsible and ethical uses of data. 

    During the Summit’s main sessions, experts from the White House, agency leadership, academia, and the private sector will discuss important new bipartisan policies that are being implemented this year, like the Evidence Act, which establishes new data leaders – chief data officers, evaluation officers, and statistical experts – across the federal government. GovDATAx will also feature discussions of the data standards that lead to better data quality, promote opportunities for public-private partnerships, and present examples of what works when using data as an asset. 

    Be a part of shaping the future of government data. Join the Data Coalition and hundreds of leaders from across the public sector, businesses, non-profits, and academia on Wednesday, October 30 to discuss the next steps for developing policies that unleash data for good. 


  • July 10, 2019 9:00 AM | Data Coalition Team (Administrator)

    The American public provides an incredible amount of information to the federal government – valued at $140 billion each year. This information is provided as part of businesses complying with regulations, individuals and firms paying taxes, and individuals applying for programs. 

    The development of the Executive Branch’s Federal Data Strategy is an effort to better organize and use all of that information to improve decision-making. On July 8, 2019, the Data Coalition joined the White House to co-sponsor a public forum gathering feedback on what actions the federal government will undertake over the next year to begin implementing the data strategy. 

    Kicking off the event, Dr. Kelvin Droegemeier, Director of the White House’s Office of Science and Technology Policy, stressed a key goal of the strategy is to “liberate” government data for society’s use, noting that nearly 90 percent of government data go unused. During the forum, 52 speakers – including 12 members of the Data Coalition – and more than 100 other experts provided input on how to improve the draft plan. 

    Here are four take-aways from the public comments provided during the forum:

    #1. Leadership is Essential for Realizing Culture Change

    New legislation enacted in early 2019 creates several new positions in government agencies, such as chief data officers, evaluation officers, and statistical experts. Throughout the public forum, speakers stressed the need for these leaders to be empowered to institute changes within federal agencies and informed about how to most effectively implement best practices for data access, management, and use.  

    Several speakers specifically stressed the critical role for newly established chief data officers in improving data quality and usefulness across government, in addition to providing improved training and tools for agencies to equip the federal workforce to use data. The concept of data literacy was also prominently featured, meaning that throughout federal agencies the workforce should be trained routinely about responsibilities and techniques for responsibly managing and using data. 

    #2. Targeted Data Standards Offer Opportunities for Efficiency

    The need for improved data standards was discussed by speakers on more than half the panels during the event, with suggestions that the Federal Data Strategy could do more to encourage standards in the areas of financial regulatory reporting, agency spending and grant management, geospatial data, and organizational entities. For example, multiple speakers highlighted the opportunity to include more direction on adopting common business entity identifiers, like the globally recognized Legal Entity Identifier (LEI), as a means of improving analytical capabilities while also reducing reporting burdens on regulated entities. 

    #3. Partnerships are an Important Element for Success

    Many speakers noted their appreciation for the opportunity to provide feedback on the data strategy, and encouraged ongoing collaboration with those outside government through public-private partnerships. As the strategy is implemented over the next year, industry, non-profits, academics, and others in the American public should have opportunities to weigh in and hold agencies accountable for achieving the stated goals in the plan. Partnerships also offer agencies a specific means to coordinate with those outside of government to ensure the implemented policies and practices achieve meaningful improvements in the country’s data infrastructure. 

    #4. Coordination at OMB and Agencies is Key

    Finally, because government data collected by one agency are often relevant for another, coordination is a critical component of success in the Federal Data Strategy. Speakers highlighted that the proposed OMB Data Council could serve as a model for agencies for working across interests, laws, and policy domains to achieve lasting change. But coordination is not the responsibility of OMB alone; every agency must coordinate across its own programs and leaders to promote culture change and a data-literate workforce, and to allocate resources to achieve the goals of the strategy.

    In the coming months, the action plan will be finalized and publicly released, incorporating the comments from the Coalition-White House public forum along with other written feedback. The Data Coalition looks forward to continuing to partner with the federal government to ensure our national data policies truly make data an asset for the country.  

    To read the Data Coalition’s comments on the Strategy’s Draft Action Plan, click here.


  • May 16, 2019 9:00 AM | Data Coalition Team (Administrator)

    The following comments were adapted from the RegTech Data Summit Keynote Address of Ben Harris, Chief Economist, Results for America – Former Chief Economist and Economic Advisor to Vice President Joe Biden. Delivered April 23, 2019 in New York City.

    In a time of seemingly insurmountable partisanship, Congress was able to come together around the issue of evidence-based policy and pass the Foundations for Evidence-Based Policymaking Act (Evidence Act) that makes some dramatic changes to our ability to learn from data and evidence in our quest for better policy. As many of you may know, the OPEN Government Data Act—which was included in the Evidence Act—implements some remarkable changes, including:

    • Installing Chief Data Officers at all federal agencies
    • Documenting and coordinating the massive breadth of data collected by agencies; and
    • Directing that non-sensitive government data be open by default.

    To start, it’s probably worthwhile to acknowledge that issues like data standardization and calls for access to open data might not be the sexiest topics, but we all can appreciate their importance.

    As a newcomer to this topic, I have only begun to understand and appreciate the need for widespread access to standardized, machine-readable data.

    The need for standardized and accessible data is crucial, but so is the need to inject more evidence-based policy into our system of legislation and regulation.

    I have come to believe that our whole economy, not just government regulators, faces a massive information deficit. Our economy, which now runs on data and information more than ever, still has gaping holes in the availability of information that undermine markets and can lead to wildly inefficient outcomes.

    When it comes to evidence-based policymaking, our government has a long way to go. As pointed out in a 2013 op-ed in The Atlantic by Peter Orszag and John Bridgeland, only about one percent of our federal dollars are allocated based on evidence and evaluations. From my perspective, this is about ninety-nine percentage points too few.

    The lack of evaluation can inject massive and longstanding inefficiencies into our federal, state, and city-level budgets, resulting in wasteful spending and missed opportunities to improve lives. This is never more evident than in our country’s $1.5 trillion tax expenditure budget. We have hardly stopped to ask whether the $1.5 trillion spent annually on targeted tax breaks is achieving its desired objectives.

    The benefits of better evidence and data extend well-beyond direct spending and tax administration. It can mitigate the economic pain caused by a recession. Indeed, the severity of the financial crisis was exacerbated by the information deficit in the wake of Lehman’s collapse and the inevitable chaos that followed. Had financial firms and regulators been able to more accurately and quickly assess the extent of the damage through standardized financial data, we would have seen less radical actions by investors to withdraw from credit risk and more effective government intervention. Of all the factors that played a role in the crisis, I don’t think it’s hyperbole to say that lack of data standardization is perhaps the least appreciated.

    Evidence-based policy is also not just a matter of better government. It’s about people’s faith in government in the first place. Results for America recently commissioned a nationally representative survey about Americans’ attitudes about the role of evidence in policymaking. When asked about “what most drives policymakers’ decisions” a whopping forty-two percent said “boosting popularity, or getting votes” while thirty-four percent said it was the influence of lobbyists and just eight percent said it was evidence about what works. Surely these responses are cause for concern.

    Fortunately, there are solutions.

    To start, in a time when there are seemingly no bipartisan bills, we saw the passage of the Evidence Act—which is known to some as the umbrella bill for the OPEN Government Data Act. As I noted at the beginning, the Evidence Act represents a major step forward not just for the capacity of government agencies to implement evidence-based policy, but for the public to gain access to open, machine-readable data.

    Of course, this law is the beginning, not the end. We can help solve private market inefficiencies by calling for more data.

    • When it comes to better understanding the fees charged by financial advisers, the U.S. Securities and Exchange Commission (SEC) can amend Form-ADV to include explicit questions on fees charged. It’s that simple.
    • When it comes to evaluating government programs, I can think of no more powerful tool than providing federal agencies a 1 percent set-aside for evaluation. Results for America has called for this for years, and it’s time that Congress pick up the charge.
    • When it comes to evaluating the $1.5 trillion tax expenditure budget, we’ll have to make some institutional changes. One option is to expand the capacity of a federal entity, like the Internal Revenue Service (IRS) or the White House Office of Management and Budget (OMB), to include periodic evaluations of this budget. Another is to call for regular Congressional approval, similar to the process for appropriations.
    • And as we prepare for the possibility of the next recession, we also need to finish the work begun in earnest to make adoption of Legal Entity Identifiers (or LEIs) ubiquitous across the financial sector. While the progress since the Great Recession has been impressive, we have more work to do to ensure this system covers not only entities in the U.S. but those of our economic allies as well.

    These reforms can and should be viewed as steps to aid the private sector, hopefully leading to better economic outcomes, lessened regulatory burdens, or both.

    On the whole, I am clear-eyed about the challenges faced by advocates for evidence-based policy. But the passage of the Evidence Act makes it clear that progress can be made. To me, it feels like we are on the cusp of a new movement to incorporate data and evidence in all that government does. Together we can help ensure that policy does a better job of incorporating data and evidence, leading to improved lives for all Americans.


  • April 05, 2019 9:00 AM | Data Coalition Team (Administrator)

    In recent years, we have seen an explosion of regulatory technology, or “RegTech.” These solutions have the potential to transform the regulatory reporting process for the financial industry and the U.S. federal government. But RegTech can only thrive if government financial regulatory agencies, like the Securities and Exchange Commission (SEC), the Commodity Futures Trading Commission (CFTC), and the Federal Deposit Insurance Corporation (FDIC), adopt structured, open data standards for the forms they collect from the private sector. We have seen changes, and momentum for RegTech adoption is picking up, but there is much more to be done.

    At this year’s RegTech Data Summit on Tuesday, April 23, in New York, we’ll explore the intersection of regulatory reporting, emerging technology, and open data standards with financial regulators, industry leaders, RegTech experts, academics, and open data advocates.

    The Data Coalition has long advocated for RegTech policy reforms that make government regulatory reporting more efficient and less burdensome on both agencies and regulated entities. The benefits are clear. Unified data frameworks support efficient analytical systems; common, open data standards clear the path for more accurate market risk assessments among regulators and bolster transparency.

    The Summit comes at an opportune time. Federal financial regulators have already begun replacing document-based filings with open data standards.

    Within the past year, the SEC voted to mandate inline eXtensible Business Reporting Language (iXBRL) for corporate financial filings. The Federal Energy Regulatory Commission (FERC) proposed a rule change that would require a transition to XBRL from XML. The House of Representatives held the first-ever hearing on Standard Business Reporting (SBR). The Financial Stability Oversight Council (FSOC) reiterated its recommendation for the adoption of the Legal Entity Identifier (LEI) to improve data quality, oversight, and reporting efficiencies.

    There are also a number of international examples of government-wide adoption of open data that can serve as a guide for similar efforts in the U.S., which we will explore at our Summit. Standard Business Reporting (SBR), as successfully implemented by Australia, is still the gold standard for regulatory modernization efforts. By utilizing a standardized data structure to build SBR compliance solutions, the Australian government was able to streamline its reporting processes and save its government and private sector AUD $1 billion in 2015-2016.

    The success of SBR in Australia is undeniable. During our Summit, panelists will discuss how Congress is considering policy reforms that would enable the adoption of RegTech solutions to achieve similar savings in the U.S. The Coalition has supported the Financial Transparency Act (FTA) since its introduction (H.R. 1530, 115th Congress). The FTA directs the eight major U.S. financial regulatory agencies to publish the information they collect from financial entities as open data – electronically searchable, downloadable in bulk, and without license restrictions.

    Once financial regulatory reporting is expressed as standardized, open data instead of disconnected documents, RegTech applications can republish, analyze, and automate reporting processes, providing deeper insight and cutting costs.

    Entity identification systems are another pain point for the U.S. regulatory community. A recent Data Foundation report, jointly published with the Global Legal Entity Identifier Foundation (GLEIF), discovered that the U.S. federal government uses at least fifty distinct entity identification systems – all of which are separate and incompatible.

    If widely and properly implemented in the United States, a comprehensive entity identification system based on the LEI could help identify and mitigate risk in financial markets, track and debar low-performing federal contractors, improve supply chain efficiency, and generally be useful anywhere a government-to-business relationship exists. By working together, industry and government leaders can reap the benefits of these emerging RegTech solutions and open data applications.
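
    To make the mechanics concrete, here is a minimal illustrative sketch in Python (not drawn from the Data Foundation report or from GLEIF tooling) of what an LEI looks like and how its built-in check digits can be verified. It assumes only the publicly documented ISO 17442 structure – 20 alphanumeric characters whose final two digits validate under the ISO 7064 MOD 97-10 scheme – and the sample identifier shown is purely for illustration.

        import re

        def is_valid_lei(lei: str) -> bool:
            """Validate an ISO 17442 Legal Entity Identifier (LEI).

            An LEI is 20 alphanumeric characters; the final two are check
            digits verified with ISO 7064 MOD 97-10: map letters to 10-35,
            and the resulting integer must be congruent to 1 modulo 97.
            """
            lei = lei.strip().upper()
            if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
                return False
            numeric = "".join(str(int(ch, 36)) for ch in lei)  # 'A' -> 10 ... 'Z' -> 35
            return int(numeric) % 97 == 1

        # Illustrative sample identifier only, not tied to any entity discussed above.
        print(is_valid_lei("5493001KJTIIGC8Y1R12"))   # True
        print(is_valid_lei("5493001KJTIIGC8Y1R13"))   # False (check digits fail)

    Part of the appeal for regulators is exactly this simplicity: any agency or vendor can validate and join records on the same 20-character key without a proprietary lookup service.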

    Karla McKenna, who is Head of Standards at GLEIF and specializes in international financial standards, and Matt Reed, Chief Counsel at the U.S. Treasury’s Office of Financial Research, are among the leading voices we will hear from at the Summit. Together with Ken Lamar, former Special Vice President at the Federal Reserve Bank of New York, and Robin Doyle, Managing Director at the Office of Regulatory Affairs at J.P. Morgan Chase, they will analyze the status of open standards and the impact of a single entity identifier.

    We’ll be delving into RegTech applications like blockchain, analytic applications, and AI systems, as well as policies that will transform regulatory reporting like the FTA, and more at the second annual RegTech Data Summit on April 23. The Summit will convene financial regulators, industry leaders, academics, and open data advocates to discuss the latest innovations in regulatory technology and what the future holds.

    Summit-goers will have the opportunity to hear from SEC, Treasury, FDIC, and J.P. Morgan Chase representatives just to name a few. Featured speakers include former SEC Commissioner Troy Paredes; Dessa Glasser, Principal, The Financial Risk Group and formerly CDO, J.P. Morgan Asset Management; and Mark Montoya, Senior Business Analyst, FDIC.

    The Summit will focus on three main themes as we explore the future of U.S. regulatory reporting technology:

    • Enterprise Digitization: The modern enterprise faces a myriad of internal and external data challenges. By internally aligning common data formats and adopting open standards, financial institutions can build a competitive information foundation to more efficiently leverage emerging technology.
    • Open Data Standards: Adopting a single, open data standard for entity identification among U.S. regulatory agencies would create a framework for financial institutions and regulators to more accurately assess market risk, improve reporting efficiencies, lower transaction costs, and improve data quality.  
    • Reporting Modernization: By adopting open data standards, the U.S. government will be able to improve oversight and provide higher levels of accountability to citizens; facilitate data-driven analysis and decision making in agencies; and expand the use of automation, which will reduce compliance costs.

    It is clear that RegTech solutions will disrupt compliance norms by increasing efficiency, enhancing transparency, and driving analytics. However, successful implementation of this technology is only possible when government and industry focus on collecting, reporting, and publishing quality, structured data. If you are eager to explore the future of compliance, in which document-based regulatory reporting will become a thing of the past, then join us at the second annual RegTech Data Summit: The Intersection of Regulation, Data, and Technology.

    For more information on the Summit, check out our event webpage.

  • March 08, 2019 9:00 AM | Data Coalition Team (Administrator)

    When President Trump signed the Foundations for Evidence-Based Policymaking (FEBP) Act (P.L. 115-435) in January, the Data Coalition celebrated a major milestone for open data legislation in the federal government. Title II of the law, the Open, Public, Electronic, and Necessary (OPEN) Government Data Act, is a transformative open data policy that modernizes the way the government collects, publishes, and uses non-sensitive public information. The law mandates that all non-sensitive government data assets be made available as open, machine-readable data under an open license by default. The Data Coalition advocated for this legislation for over three years and it is now law. So, what next?  

    The Data Coalition, Center for Data Innovation (CDI), and the American Library Association hosted a joint panel to discuss the OPEN Government Data Act’s impact on the future of open data in the United States. The Coalition’s own Senior Director of Policy, Christian Hoehner, as well as representatives from BSA | The Software Alliance, the Internet Association, SPARC, and the Bipartisan Policy Center, discussed what this new law means for government modernization, data-centric decision making, and the implementation of successful federal open data initiatives.

    Congressman Derek Kilmer (D-WA-6), an original sponsor of the OPEN Government Data Act and Chairman of the newly established House Select Committee on the Modernization of Congress, provided opening remarks that touched upon the bill’s wide-reaching benefits as well as the future of open data policy and implementation. Congressman Kilmer touted the new law’s potential to create more economic opportunity for people in more places. Greater access to government data will allow Americans to start new businesses, create new jobs, and expand access to data and resources often concentrated in urban areas.

    The Congressman emphasized that the law passed with strong bipartisan support. He observed that the simple notion of giving taxpayers additional access to public data is beneficial for citizens.

    “Simply put, the OPEN Government Data Act gives the data the government collects to the people who pay for it, which is all of you,” Congressman Kilmer said during his remarks. “The bill passed because people across the increasingly wide political spectrum know that making access to government data…is a good idea.”

    Opening valuable government data assets will increase government accountability and transparency as well as technological innovation and investment. Under the law’s mandates, public government data will be made machine-readable by default without compromising security or intellectual property. This is a major victory for the open data movement, but as Congressman Kilmer acknowledged, this is certainly not the last step. There is still much to be done to ensure the law is successfully implemented.

    While the OPEN Government Data Act draws a lot of its mandates from already established federal orders – specifically President Obama’s 2013 “Open Data Policy-Managing Information as an Asset” (M-13-13) – the law adds additional weight and formalizes a number of open data best practices. The law legally establishes definitions for “open license,” “public data asset,” and “machine-readable,” which clarify the specific assets under scrutiny in this law. As Nick Shockey of the Scholarly Publishing and Academic Resources Coalition (SPARC) stated, the strong definitions behind open licensing will be great in strictly codifying data procedures around research and beyond.

    Establishing these definitions was critical as it reinforces another key part of this law: requiring the publication of public data assets as “open government data assets” on agencies’ data inventories. Under M-13-13, agencies tried to establish a comprehensive data inventory, but internal silos and other barriers prevented the publication of certain data sets.

    Nick Hart of the Bipartisan Policy Center drew from an example to explain the barriers. “The Census Bureau, for example, has some data that they received from other agencies. The other agencies may not have reported it on their inventory because they gave it to the Census Bureau, [and] the Census Bureau didn’t report it because it wasn’t their data.” Identifying the extent of government data assets has been a constant challenge for researchers and industry, but the open data inventories mandated in Title II will clarify exactly what public data assets the government has on hand.

    The open data inventories will contain all government data assets that aren’t open by default but would otherwise be available under the Freedom of Information Act (FOIA). “That’s going to, I think, make these data inventories a lot more robust and a lot more useful to researchers as they’re trying to identify what data an agency has that might not be currently made available through Data.gov,” said Christian Troncoso, Policy Director at BSA | The Software Alliance.  

    Another important aspect of the OPEN Government Data Act discussed during the event was the establishment of an agency Chief Data Officer (CDO) Council. Every federal agency is now required to appoint a CDO, and these individuals will be tasked with implementing the components of the new law. One major challenge going forward will be how federal agencies establish and equip their CDO functions. That is, will they recognize that, as the law intends, the CDO function should be distinct and independently established, apart from traditional information technology leadership functions? The hope is that CDOs will build communities of practice and work with their fellow members of the Council to share best data practices that could be implemented across agencies.

    In the end, CDOs will not just be Chief Information Officers under a different name. They will be the sentinels of quality, accurate, and complete agency data and, hopefully, will shift the culture to one of data management and data-driven decision making. As our Senior Director of Policy Christian Hoehner said, better education about how data will be utilized by agencies and the public will help incentivize agencies to ensure compliance with the law.

    The panel served as a celebration of how far open data has come and judging by the opinions of all panelists, the government is on track to continue expanding its open data initiatives. The Federal Data Strategy and President's Management Agenda show that the government recognizes the value of its vast stores of data. These initiatives will run parallel to the implementation of the OPEN Government Data Act, and in some cases, they will also intertwine.

    These executive efforts, including a recent Executive Order on U.S. artificial intelligence, and bipartisan congressional support show that open data utilization is a government priority, and our Coalition is pleased to see action by the Executive Branch, Congress, and agencies.

    While there is still plenty of work to be done once the law takes full effect on July 8, 2019, the passage and support of the OPEN Government Data Act is an indication that the government is moving closer to modernizing its processes and management using standardized, open data. The Data Coalition looks forward to working with the government on successfully implementing this transformative legislation.

    If you would like to watch the panel, “What’s Next for Open Data in the United States,” click here.


  • November 15, 2018 9:00 AM | Data Coalition Team (Administrator)

    Earlier this year, the Administration released its vision for the way our government can better collect and leverage data in four main categories:

    1. Enterprise Data Governance.
    2. Access, Use, and Augmentation.
    3. Decision Making & Accountability.
    4. Commercialization, Innovation, and Public Use.

    The Federal Data Strategy will define principles and practices for data management in the federal government. The Office of Management and Budget (OMB) is collaborating with the private sector, trade associations, academia, and civil society to gather feedback and comments on the proposed strategy.

    Last week (November 8th), the Data Coalition joined the Bipartisan Policy Center (BPC) and OMB to co-host a public forum discussing the Federal Data Strategy. The second in a series on the strategy, this forum allowed the public, businesses, and other stakeholders to comment on the recently published draft set of practices. Data Coalition members DeepBD, Elder Research, Morningstar, SAP, Tableau, and Xcential, as well as Data Foundation supporter companies Kearney and Company and REI Systems, provided feedback on the proposed practices. In their comments, members emphasized the value of federal data as applied in analytics, a standards-based modeling path, and the use of machine-readable forms that would create a better link between government services and citizens.

    Most commenters acknowledged that the Federal Data Strategy represents an effort to initiate much-needed changes to the cultures around data across agencies and offered ways to improve the practices and implementation. Some attendees emphasized the need for greater clarity on the draft practices and provided examples of how the government can maximize the value of its data. Clarity and direction, they argued, would help move the strategy from an idea to a potential set of actionable steps for cultural changes.

    Better data utilization and management were noted as key to the success of the strategy. The Digital Accountability and Transparency Act (DATA Act) has significantly increased the quality of the data reported to the government. Our members who provided public statements were quick to bring attention to these improvements and how the DATA Act set the groundwork to fortify potential efforts to reach CAP Goals 2 (Data as a Strategic Asset) and 8 (Results-Oriented Accountability for Grants Management).

    According to Sherry Weir of Kearney & Company, if OMB starts with a budget request in the same standardized data format, the U.S. Treasury and agencies could then merge data reported under the DATA Act (USAspending.gov) with federal appropriations bills. This connection is only possible with intelligent data stewardship, but it has the ability to connect otherwise disparate datasets across the budget lifecycle and provide insights that can motivate better, more informed federal spending and policymaking.

    Throughout the day, a few commenters expressed concern over the complexity of the current draft strategy. They pointed out that the strategy, laid out across forty-seven practices and organized into ten principles, is too unwieldy for executive decision makers to readily articulate across their organizations. The MITRE Corporation suggested that the strategy could be cut down to a single-page reference document and provided an example.

    It would be no simple task to distill the strategy. Panelists suggested that the Federal Data Strategy Team look for small wins in data modernization efforts to build momentum on the larger goals.

    Larger conclusions presented by commenters included a view that the strategy fails if public servants cannot work across agency data silos to make better, data-driven management decisions that best serve the public.

    Stewardship is key to the success of the Federal Data Strategy, and the Administration needs sustained leadership to guide it in order to create the most value out of its vast stores of data. With the feedback of all these industry leaders, advocates, and data experts, OMB is now tasked with using the public perspective to build a data strategy that facilitates efficient government data management.

    The Data Coalition was thrilled to partner with BPC and OMB on this important forum. Audio recordings of the forum are available online, as is social media coverage of the event. As a reminder for interested parties, public comments on the updated Federal Data Strategy are due by Friday, November 23, and can be submitted online in various forms. The Data Coalition will be providing its own written comments on the Federal Data Strategy, which we hope the Administration will strongly consider when forming the final strategy.


  • September 07, 2018 9:00 AM | Data Coalition Team (Administrator)

    Modernizing financial regulatory reporting is no easy task. Regulators and regulated entities continue to rely on outdated technology and reporting systems. Open data standards are the key: by replacing document-based reports with standardized data, regulators can spur modernization.

    Data companies behind the Data Coalition have the solutions to make financial regulation more efficient and transparent for both regulatory agencies and the industry. Standardized data can be instantly analyzed, updated, and automated.

    On September 12, 2018, we brought together current and former Treasury and SEC officials, global reformers, and industry experts to explore the ongoing shift in financial regulatory reporting from documents to data — and its profound benefits for both regulators and regulated.

    “Modernizing Financial Regulatory Reporting: New Opportunities for Data and RegTech” was an opportunity for attendees to better understand how data standards will enable new RegTech and blockchain applications for the financial industry at our first-ever New York City event. The event informed and encouraged key stakeholders to move forward with financial regulatory reporting reforms.

    The half-day gathering at Thomson Reuters headquarters in Times Square highlighted the modernization initiatives already underway and looked to the future. Here is a glimpse of what attendees learned throughout the event:

    Open data standards are becoming the norm – Thomson Reuters is leading the charge, Washington is making moves

    Open PermID Linked Data Graph. Source: https://permid.org/

    Thomson Reuters has moved away from the traditional model of charging for a standard, signaling the growth of the open data ecosystem. Rather than selling the standard itself, financial services leaders view the data and analysis on financial entities and instruments as a value-add service for clients. Thomson Reuters developed the Open PermID, which exemplifies open data developments in financial services, and is in line with the open-data movement happening in Washington.

    The PermID has created an efficient, transparent, and systematic market. As Thomson Reuters  states:

    The ID code is a machine-readable identifier that provides a unique reference for data item. Unlike most identifiers, PermID provides comprehensive identification across a wide variety of entity types including organizations, instruments, funds, issuers, and people. PermID never changes and is unambiguous, making it ideal as a reference identifier. Thomson Reuters has been using PermID in the center of our own information model and knowledge graph for over seven years.

    Open data standards like PermID, used for financial instruments, and the Legal Entity Identifier (LEI) can provide meaningful changes in the way financial data is collected and synthesized. Both the PermID and LEI are examples of how scalable open standards can improve internal efficiency and external transparency for financial regulatory reporting.

    While the private sector develops viable use cases, policymakers in Washington are taking action to drive modernization in financial regulatory reporting.

    State of Financial Regulatory Reporting

    In Washington, three key policy milestones have occurred over the past 18 months, which demonstrate that agency officials, the Administration, and Congress are driving modernization.

    July 16, 2018: The House Financial Services Committee ultimately did not include an anti-SEC data measure in the House-passed JOBS & Investor Confidence Act — a package of thirty-two bipartisan job creation bills. The Small Company Disclosure Simplification Act (H.R. 5054), which was left out of the compromise JOBS Act 3.0 package, remains a controversial measure lacking broad support in the Committee.

    June 29, 2018: The SEC voted to adopt Inline XBRL for corporate financial data disclosure (see the final rule). The move to Inline XBRL will end duplicative documents-plus-data financial reporting and transition to data-centric reporting; a brief illustrative sketch of what inline tagging looks like follows these milestones. This initiative is part of a broader modernization of the SEC’s entire disclosure system. The Data Coalition and its member companies (see comments from Workiva, Deloitte, Morningstar, and Grant Thornton) have long supported the adoption of Inline XBRL at the SEC. The Coalition’s comment letter further explains our support of the SEC’s decision to adopt the use of iXBRL (see our comment letter).

    June 13, 2017: Treasury Secretary Steven Mnuchin testified before the House Appropriations Committee in defense of the Treasury Department’s Fiscal Year (FY) 2018 Budget request. Mnuchin’s testimony showed an opening to standardize data fields and formats across the nation’s overlapping financial regulatory regimes – just as the Data Coalition has already been recommending to Congress.
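
    For readers less familiar with the format, here is a minimal, hypothetical sketch of what “inline” tagging means in practice: the machine-readable fact is embedded in the human-readable HTML itself rather than filed as a separate data document. The tiny page and the us-gaap:Revenues tag below are illustrative assumptions, not an excerpt from any actual SEC filing or from the SEC’s rule.

        # A toy Inline XBRL (iXBRL) fragment: the revenue figure a reader sees in the
        # HTML is the same value a machine extracts, so no separate data exhibit is needed.
        import xml.etree.ElementTree as ET

        IX_NS = "http://www.xbrl.org/2013/inlineXBRL"

        SAMPLE = """<html xmlns:ix="http://www.xbrl.org/2013/inlineXBRL">
          <body>
            <p>Revenues for fiscal 2018 were $
              <ix:nonFraction name="us-gaap:Revenues" contextRef="FY2018"
                              unitRef="usd" scale="6" decimals="0">1,234</ix:nonFraction>
              million.</p>
          </body>
        </html>"""

        root = ET.fromstring(SAMPLE)
        for fact in root.iter(f"{{{IX_NS}}}nonFraction"):
            # scale="6" means the displayed number is stated in millions.
            value = float(fact.text.replace(",", "")) * 10 ** int(fact.get("scale", "0"))
            print(fact.get("name"), fact.get("contextRef"), value)
            # -> us-gaap:Revenues FY2018 1234000000.0

    Because the same tagged value serves both readers and software, a filer no longer needs to maintain a separate XBRL exhibit alongside the HTML document.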

    Regulators need standards for accuracy, analysis, and fraud reduction

    Financial regulators rely heavily, in some cases solely, on corporate filings and data analytics to detect fraud schemes – including the practice colloquially known as ‘pumping and dumping,’ which attempts to inflate the price of a stock through recommendations based on false, misleading, or greatly exaggerated statements. Corporate financial filings are essential to accurately identifying fraudulent activity and ultimately protecting investors.

    If financial regulators adopted data standards across all reporting systems, it would make identifying fraud far easier. That is why our Coalition is working to persuade Congress to pass the Financial Transparency Act (H.R. 1530) (summary here). The FTA would require the eight major financial regulators to adopt common data standards for the information they collect from the private sector. The bill has gained thirty-two bipartisan cosponsors. When passed, it will be the nation’s first RegTech law.

    Notable “pump and dump” schemes include ZZZZ Best Inc., Centennial Technologies, and Satyam Computer Services.

    Why the financial industry should embrace open data standards

    During the final panel of the day, attendees heard financial industry experts describe why their colleagues should get behind open standards, and why the financial industry should welcome regulatory action.

    Currently, financial entities rely on mostly manual reporting processes as they send information to government regulators. Under this outdated system, companies typically have one group responsible for preparing reports and another responsible for issuing them. For larger companies, error is nearly inevitable in a structure so heavily reliant on layers of human oversight.

    Data standardization means lower compliance costs for the financial industry, more efficient enforcement for regulators, and better transparency for investors – but only if regulators work together to modernize.


