Data Journalism: Using Data to Report the News

Data journalism sits at the intersection of statistical analysis, database reporting, and narrative storytelling, and its practice has reshaped how public-interest stories are uncovered, verified, and communicated. This page covers how data journalism functions as a professional discipline, the tools and methods practitioners use, the types of stories it produces, and how editorial decisions are made when data drives the reporting. The field spans newsrooms ranging from large national outlets to investigative journalism teams operating at nonprofit publishers.

Definition and scope

Data journalism is the practice of acquiring, cleaning, analyzing, and visualizing structured datasets to inform or construct a news story. It extends beyond simple chart-making — at its core, it applies analytical reasoning to public records, administrative databases, financial disclosures, and sensor-generated data to surface patterns that are not visible in individual documents or interviews.

The National Institute for Computer-Assisted Reporting (NICAR), a program of Investigative Reporters and Editors (IRE), has trained journalists in database analysis since 1989. IRE and NICAR maintain the largest professional training infrastructure for this specialization in the United States. The Global Investigative Journalism Network (GIJN) documents comparable practices across more than 90 countries.

Data journalism differs from traditional reporting along at least three distinct axes:

  1. Source material: Data journalism relies on machine-readable records — spreadsheets, SQL databases, APIs, geographic information system files — rather than documents alone.
  2. Verification method: Patterns in data are tested through statistical methods; individual anomalies are confirmed through traditional sourcing.
  3. Output format: Stories often include interactive graphics, downloadable datasets, or methodology notes published alongside narrative text.

The scope extends to any beat. Budget data drives coverage of municipal finance; crime statistics inform public safety reporting; campaign finance records underpin coverage of elections. News reporting standards across professional journalism require that data-driven claims meet the same accuracy thresholds as any other factual assertion.

How it works

A data journalism workflow typically follows a structured sequence, regardless of the outlet or beat:

  1. Data acquisition: Obtaining datasets through FOIA requests, direct government portal downloads, API queries, or public records scraping. Federal agencies publish substantial data through sources such as Data.gov and the Federal Election Commission's data portal.
  2. Data cleaning: Raw government data frequently contains duplicates, inconsistent formatting, missing values, and coding errors. This phase can consume 60–80% of total project time on complex datasets, according to practitioners surveyed at NICAR conferences.
  3. Analysis: Journalists apply sorting, filtering, pivot tables, and statistical functions using tools including Microsoft Excel, R, Python, and SQL. Geographic analysis uses QGIS or ArcGIS.
  4. Verification: Findings are cross-checked against source documents, and named experts or affected parties are given the opportunity to respond.
  5. Visualization and publication: Charts, maps, and tables are built using tools such as Datawrapper, Flourish, or custom JavaScript libraries. Methodology notes document how analysis was performed.
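
Steps 2 and 3 above can be sketched in a few lines of Python using only the standard library. The spending records here are invented for illustration, but the defects they contain (duplicates, inconsistent department names, a missing value) are the kinds practitioners describe:

```python
import csv
import io
from collections import Counter

# Hypothetical raw export of city spending records, with the defects
# typical of real government data: duplicate rows, inconsistent
# department names, and a missing amount.
raw = """department,amount
Public Works,1200
public works ,800
PUBLIC WORKS,1200
Parks,500
Parks,
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Cleaning: normalize department names, drop rows missing an amount,
# then drop exact duplicates of the normalized record.
cleaned = []
seen = set()
for row in rows:
    dept = row["department"].strip().title()
    if not row["amount"]:
        continue  # missing value: exclude rather than guess
    key = (dept, row["amount"])
    if key in seen:
        continue  # duplicate record
    seen.add(key)
    cleaned.append({"department": dept, "amount": float(row["amount"])})

# Analysis: total spending per department -- a minimal "pivot table".
totals = Counter()
for row in cleaned:
    totals[row["department"]] += row["amount"]

print(totals)
```

In a real project the same normalization and aggregation would typically run in pandas or SQL against thousands of rows, but the editorial logic (exclude, deduplicate, then aggregate) is identical.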

The distinction between exploratory analysis (hypothesis generation) and confirmatory analysis (testing a specific claim) matters editorially. Exploratory work may lead nowhere publishable; confirmatory work informs the framing and sourcing of a finished story.
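
The two modes can be illustrated with a short Python sketch. The inspection scores below are invented for illustration; the point is the shape of the work, not the numbers:

```python
import statistics

# Hypothetical inspection scores for two districts (made-up data).
scores = {
    "District A": [72, 68, 75, 70, 69],
    "District B": [88, 91, 85, 90, 87],
}

# Exploratory: scan summary statistics for anything worth pursuing.
for district, values in scores.items():
    print(district, round(statistics.mean(values), 1),
          round(statistics.stdev(values), 1))

# Confirmatory: test one specific, pre-stated claim -- here, that
# District A's median score falls below District B's lowest score.
# A reporter would then verify the flagged records through sourcing.
claim_holds = statistics.median(scores["District A"]) < min(scores["District B"])
print(claim_holds)  # True for this sample
```

The exploratory pass may surface nothing; the confirmatory test is what a published story would actually rest on.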

Common scenarios

Data journalism surfaces across beats and story types in recognizable patterns: municipal finance coverage built from budget data, public safety trend pieces built from crime statistics, election coverage built from campaign finance filings.

Each of these scenarios produces stories that could not be reported through interviews alone; the dataset becomes a primary source alongside human sources.

Decision boundaries

Not all data is suitable for publication, and data journalism requires active editorial judgments about when quantitative findings are strong enough to carry a story.

Threshold questions that shape editorial decisions include whether the dataset is complete and current enough to support the claim, whether an observed pattern survives statistical scrutiny, and whether the individual cases it flags can be confirmed through traditional sourcing.

Journalism ethics standards apply to data sourcing just as they do to human sourcing. Misrepresenting the origin, completeness, or analytical method behind a dataset constitutes the same kind of failure as misquoting a source. Outlets including ProPublica publish full methodology notes and, where legally permissible, release the underlying data for public review.

Data journalism intersects directly with concerns about misinformation and disinformation — selectively presented statistics and decontextualized charts are among the most effective vectors for misleading audiences, whether in news or in bad-faith content that mimics news formats. The broader news media landscape continues to reckon with how analytical credibility is established and communicated to audiences with varying statistical literacy.
