Analyzing a database of over 235 million data points, we investigated how apps track millions of consumers' daily movements. This story ran on the front page of the New York Times print edition on Monday, December 10, 2018. It was also featured on the podcast The Daily.
An investigation into the fake-view ecosystem on YouTube found that sellers are collecting millions of dollars while others feel defrauded. Meanwhile, the company seems powerless to stop it, despite testimony to Congress that it can reliably detect fake views on its platform. This story ran on the front page of the New York Times print edition on Sunday, August 12, 2018.
As we put more and more of ourselves online and into our devices, what are the questions smart consumers should be asking? I produced this graphic novel with award-winning non-fiction cartoonist Josh Neufeld as a field guide to our new digital condition. Josh and I spoke about the story with PRI's The World, Fast Company and others. This story won an Editor & Publisher award in the Innovation in Storytelling category.
With technology becoming a bigger portion of daily life, we showed how the biggest tech companies are competing to own every hour of it.
When a company's security is compromised, 47 states require companies to notify their customers. What counts as "personal data" varies widely, however.
I reverse-engineered iOS spell check to see which words it would refuse to offer spelling suggestions for, including "abortion," "rape," and "virginity," even as it corrected similarly controversial words like "suicide" and "marijuana."
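The probing technique can be sketched in a few lines: feed deliberate misspellings of target words to a suggestion function and record which words never come back as corrections. This is a minimal stand-in, not the actual probe: difflib plays the role of the iOS spell checker, and the vocabulary and blocklist here are illustrative values that simulate the observed behavior.

```python
import difflib

# Illustrative stand-ins: a tiny vocabulary, plus a blocklist simulating
# the observed iOS behavior. The real probe queried iOS itself.
VOCABULARY = ["abortion", "rape", "virginity", "suicide", "marijuana"]
BLOCKED = {"abortion", "rape", "virginity"}

def checker(typo):
    """Stand-in spell checker: suggests close matches, but never blocked words."""
    close = difflib.get_close_matches(typo, VOCABULARY, n=3, cutoff=0.8)
    return [w for w in close if w not in BLOCKED]

def probe(targets):
    """For each target word, does a one-letter typo of it get corrected back?"""
    return {word: word in checker(word[:-1] + "x") for word in targets}

print(probe(["abortion", "suicide", "marijuana"]))
# {'abortion': False, 'suicide': True, 'marijuana': True}
```

Running the same probe across a large word list and diffing which words never surface as suggestions is what isolates the silently suppressed terms.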
The country's supply of voting machines is aging all at once, and few jurisdictions have the funds to replace them in time for the next general election. I built an interactive map where readers could look up the voting machines used in their county and the potential problems with them.
Putting the abstract numbers of the Syrian refugee crisis into context, readers could enter their address and see how much space that many people would take up. I also created the ability for readers to screenshot and share their view of the interactive with a comment, which I open-sourced as its own library called Banquo. I wrote about how it works for the Tow Center for Digital Journalism at Columbia University and published a more technical walkthrough of Banquo for Source.
Following Hurricane Harvey, we investigated why flood damage so often seemed to fall outside of FEMA-designated flood zones. Our investigation surfaced previously unseen data showing that some maps are over forty years old, exposed methodological flaws in how those maps were created, and highlighted political lobbying that delayed the drawing of new ones.
With Trump weighing a tariff on steel and aluminum imports under a little-used "Section 232" authority, we compiled all previous uses of this authority along with market data to show the effects of such a move and why its political and economic reasoning was not supported by historical or market precedent.
This project won a 2016 Online News Association Award in the Breaking News (large organization) category. I had been scraping the exact speeds and locations of Amtrak trains for over a year when Amtrak 188 crashed on a curve outside Philadelphia. Before any other news organization, we were able to determine its exact speed as it entered the curve: 106.22 mph, more than twice the limit.
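The scraper's core job was normalizing each polled observation into a consistent row of train, time, position and speed. A minimal sketch of that step, under assumed field names (`train_id`, `ts`, `lat`, `lon`, `speed_mph` are hypothetical; the real Amtrak feed's schema differed, and the sample values below are illustrative except for the reported 106.22 mph):

```python
from datetime import datetime, timezone

def parse_observation(record):
    """Normalize one scraped train observation into a flat, typed row.

    Field names are hypothetical stand-ins for the real feed's schema;
    the normalization step itself is the idea being shown.
    """
    return {
        "train": record["train_id"],
        "time": datetime.fromtimestamp(record["ts"], tz=timezone.utc),
        "lat": float(record["lat"]),
        "lon": float(record["lon"]),
        "mph": float(record["speed_mph"]),
    }

# Illustrative sample record (coordinates and timestamp are placeholders).
sample = {"train_id": "188", "ts": 1431474000,
          "lat": 40.0, "lon": -75.1, "speed_mph": 106.22}
obs = parse_observation(sample)
print(obs["train"], obs["mph"])
```

Appending rows like this on every poll is what produced a year-long archive that could be queried the night of the crash.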
In order to give readers a sense of the lived experience of those affected by the current court battle over abortion rights, we created a series of maps showing the before and after for women living in the affected state.
Following Trump’s election, we maintained an up-to-date list of his business conflicts.
After President Trump fired FBI director James Comey, we put together an annotated timeline showing the topics of investigation and the stated reasons for dismissal of the two other officials who were investigating Trump and were also removed from their positions.
We collected media reports, research and detainees' handwritten letters to give context to the recent hunger strikes at Guantanamo Bay. We also catalogued the camp's population over time by extracting dates from hundreds of government documents, something that had not been done in this way before.
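Pulling dates out of a pile of document text usually comes down to pattern matching and parsing. A minimal sketch of that step (the real documents used several date formats, so the production extractor would have needed more patterns than this single one, and the sample sentence is invented for illustration):

```python
import re
from datetime import datetime

# Matches long-form dates like "March 5, 2013". Real government documents
# mix formats, so a production extractor needs additional patterns.
DATE_RE = re.compile(
    r"\b(January|February|March|April|May|June|July|August|"
    r"September|October|November|December) (\d{1,2}), (\d{4})\b")

def extract_dates(text):
    """Pull every long-form date out of a document's text, in order."""
    return [datetime.strptime(" ".join(m.groups()), "%B %d %Y").date()
            for m in DATE_RE.finditer(text)]

page = "The detainee arrived on February 7, 2002 and was assessed on June 19, 2008."
print(extract_dates(page))
# [datetime.date(2002, 2, 7), datetime.date(2008, 6, 19)]
```

Aggregating the extracted dates across hundreds of documents is what allows the camp's population to be charted over time.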
Using data from Treasury.io and the Sunlight Foundation's Capitol Words project, I created a quick chart to show how the debt ceiling has only recently become a debated issue in Congress.
As the non-profit / foundation-funded newsroom model becomes more prevalent, measuring success beyond pageviews is an increasingly pressing question. NewsLynx was a research project funded through the Tow Center for Digital Journalism to a) study how newsrooms currently measure the impact of their work and b) develop a proof-of-concept platform to help these organizations better wed the qualitative and quantitative sides of impact. We produced a white paper and released our code under an open-source license. The project's debut was covered by Nieman Lab and others.
For the graphic novel Terms of Service, we wanted to make sure the story was easily readable across devices and on different platforms such as the iTunes Store and Google Play. To do this, I designed, built and open-sourced a web comic viewer called Pulp and a builder interface called Pulp Press. The project has been used by independent comic creators as well as the San Francisco Public Press and the BBC Magazine. This project was a finalist for a 2015 Online Journalism Award in the Innovation category.
With a group of other journalists, developers and data scientists, I built and open-sourced a first-of-its-kind data feed for how the U.S. government spends and takes in money on a daily basis. The project has been used in stories by Al Jazeera America, Time and others. The project was funded through a Knight-Mozilla OpenNews Code Sprint grant.
For the past eight years or so, I've co-organized a weekly meetup of journalists and developers. The rotating collection of folks has led to a number of small-scale, informal projects as well as larger ones such as Treasury.io, Big Data: the Board Game and NewsLynx.
As a way to make interviews more structured, and to create a tool that introduces reporters to programming, I developed a language syntax and syntax highlighter for conducting interviews.
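The idea can be illustrated with a toy version: a line-based syntax where a prefix marks each speaker, and a highlighter that renders the lines with per-role styling. This miniature grammar ("Q:" / "A:" prefixes) is a hypothetical stand-in, not the project's actual syntax.

```python
import html

def highlight(transcript):
    """Render a tiny, hypothetical interview syntax as styled HTML.

    "Q:" marks a question, "A:" an answer; anything else is a note.
    The real project's grammar was richer than this sketch.
    """
    out = []
    for line in transcript.strip().splitlines():
        if line.startswith("Q:"):
            out.append(f'<p class="question">{html.escape(line[2:].strip())}</p>')
        elif line.startswith("A:"):
            out.append(f'<p class="answer">{html.escape(line[2:].strip())}</p>')
        else:
            out.append(f'<p class="note">{html.escape(line.strip())}</p>')
    return "\n".join(out)

print(highlight("Q: How long did the scrape run?\nA: Just over a year."))
```

Because speakers are marked in the source text itself, the same file doubles as structured data: questions and answers can be extracted, counted or restyled without re-editing the transcript.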
In our newsroom, we've made a number of small helper tools for things like CMS-friendly DocumentCloud embeds or using Quartz's Chartbuilder. The problem is that remembering the URLs for all these tools gets tiresome. I developed Aufbau as a kind of iframe for web apps, but as a desktop app: it lets you bundle any number of tools and have them all live in the easily accessible dock.
I developed the deployment and publishing process for our digital projects at Al Jazeera America, which allows for versioned, scheduled and collaborative deployments.