DataKind’s Founder & Executive Director, Jake Porway, and DJ Patil recently caught up for a chat. Here’s some of what they spoke about…
A little about DJ…
- Husband, dad, data maverick
- Chaos Theory mathematician
- Served in 2 Administrations on domestic and national security
- US Chief Data Officer 2015-17 appointed by President Obama
- Innovating in healthcare at Devoted Health
Jake: What’s the “WHAT IF?” that inspires the work you do?
DJ: Going into the White House, I had a bit of time to reflect on what mattered most to me and how I’d ground myself each day. The mission I came up with for myself was to make the world better for our children, and our children’s children.
It’s a powerful lens for prioritizing your time, and it led me to meet all kinds of incredible people: from inmates who are working to make themselves better fathers and mothers, to members of the armed services who put their lives on the line every day to make sure that we can sleep safe at night, to people who are working to find a new way to connect with another human in need. They are all doing incredible things.
I can’t help but think about how we can scale them. Imagine if we could take their projects and insights and scale them 100x? Think about the force multiplier that this would be for the nation and the world! We all know that data and technologies are part of the equation. In many ways that’s why so many people have contributed to the mission of DataKind. We’re all on the same mission: to make this world a better place for our children and our children’s children.
Jake: AI – lots of hope or lots of hype?
DJ: There is incredible opportunity, and also serious concerns that need to be addressed. AI offers one of the great opportunities to enable drug discovery and tailored medical treatments. Similarly, it can offer phenomenal productivity increases. We need to distinguish between generalized AI, which I’m not concerned about in the near term, and specific AI, which has uses that are very concerning.
We have to ask, how do we use this kind of technology responsibly? For example, if we’re using AI/data science/machine learning in predictive policing, what does this mean given that the training data is racially biased? How do we evaluate “fairness”?
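To make the fairness question concrete, here is a minimal sketch of one common check, demographic parity (the basis of the “four-fifths rule”), applied to a hypothetical model’s predictions. The data, group labels, and threshold are illustrative assumptions, not anything from DJ’s remarks, and this is only one of many fairness definitions.

```python
# Sketch: demographic parity / disparate-impact check on hypothetical predictions.
import numpy as np

def demographic_parity_ratio(predictions, group_labels, group_a, group_b):
    """Ratio of positive-prediction rates between two groups.

    A ratio far from 1.0 (the rough "four-fifths rule" floor is 0.8)
    is one signal that the model treats the groups differently.
    """
    preds = np.asarray(predictions)
    groups = np.asarray(group_labels)
    rate_a = preds[groups == group_a].mean()
    rate_b = preds[groups == group_b].mean()
    return rate_a / rate_b

# Hypothetical binary predictions (1 = flagged by the model) for two groups.
preds  = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

ratio = demographic_parity_ratio(preds, groups, "A", "B")
print(f"Demographic parity ratio (A vs. B): {ratio:.2f}")  # here ~3.0, far from 1.0
```

A single metric like this won’t settle what “fair” means, but it does force the question of biased training data into something a team can measure and argue about.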
As the adage goes, “with great power comes great responsibility”. As technologists, we haven’t figured out what it means to be responsible with these technologies, and now is the time to do so. That’s why it’s so critical that anyone who works with data has ethics and security built into their training curriculum, fully integrated, not just as an elective. Just because we can with data doesn’t mean we should.
Jake: What’s the best example you’ve seen (or hope to see) of data science or machine learning being used for social good?
DJ: I love the work of data scientists from the University of Chicago Data Science for Social Good Fellowship program who worked to understand the factors behind when officers are likely to use excessive force. In the modeling, they realized that there were features such as whether the officer had just dealt with a suicide or a domestic violence case where a child was present. In the process, they realized that the dispatch system doesn’t give officers time to emotionally decompress. As a result, they’re now building a “smart” dispatch system that will take these issues into account.
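To give a rough sense of the idea, here is an illustrative sketch (not the fellowship team’s actual pipeline) of how dispatch-history features like the ones DJ mentions might feed a simple risk score that a “smart” dispatcher could consult. All feature names, data, and labels below are hypothetical.

```python
# Sketch: a toy risk model over hypothetical dispatch-history features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [recent_suicide_call, recent_dv_call_child_present, hours_since_last_break]
X = np.array([
    [1, 0, 1.0],
    [0, 1, 0.5],
    [1, 1, 0.2],
    [0, 0, 6.0],
    [0, 0, 8.0],
    [1, 0, 0.3],
])
# 1 = a subsequent adverse incident occurred (hypothetical labels).
y = np.array([1, 1, 1, 0, 0, 1])

model = LogisticRegression().fit(X, y)

# Before assigning the next call, a dispatcher could score the officer's
# current state and route around elevated-risk assignments.
officer_state = np.array([[1, 0, 0.4]])
print("Estimated risk:", model.predict_proba(officer_state)[0, 1])
```

The point of the real project, as DJ describes it, is less the model itself than what it surfaced: the system wasn’t giving officers room to decompress, and the fix is in how dispatch works.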
Jake: What are you most excited about right now?
DJ: I’m most excited about the youth who are speaking up and speaking out about the injustices of the world. This includes the youth taking my generation to task about gun control and the data scientists who are finding new insights in data on climate change and the opioid epidemic. Their unrelenting passion to make the world better for our children and our children’s children is just…well, awesome!