Ethical Tech

All Tech is Human Interviews Carey Jenkins, our CEO, on Being an Intentional User and Innovator of Technology

Author

Andrew Sears, All Tech is Human

All Tech Is Human has an ambitious goal: changing how technologies are developed and deployed. The organization works toward a more thoughtful tech future by making the field more inclusive, multidisciplinary, and participatory, and it is busy building a dedicated team of advisors, a broad range of experts it relies on for advice, and a network of volunteers around the globe. All Tech is Human spoke with Carey about what individuals and companies can do to be responsible stewards of digital technology.

All Tech is Human (ATIH): How and why did you first become passionate about the intersection of ethics and technology?

Carey: I became CEO of a technology company that has been designing and building digital products for start-ups and enterprises since 2006. Right as privacy concerns and regulatory efforts were becoming mainstream conversations, my 5-year-old was becoming obsessed with screens and apps. Both in my work and in my life I was facing a constant drumbeat of questions about the role of technology in our lives.

It’s becoming increasingly obvious that allowing the biggest technology companies in the world to determine our boundaries, as citizens, parents, and businesses, is a mistake we still have time to correct. We shouldn’t think of the monetization of our data, lost privacy, or alternative facts as inevitable. Progress, advances, breakthroughs, and surprises are inevitable, but the way we prepare, interact, direct, create, and learn - that is of our choosing. At Substantial, we have the power to be specific, transparent, and protective product creators and technologists. I don’t take the responsibility lightly.

What principles do you use to guide your digital technology use in your own life?

Moderation and intention.

There are some platforms I’ve quit entirely and some I’ve reduced greatly. At some point, I realized that I can’t devote time to nurturing an online presence or to participating deeply in online conversation. My feeds were full of pictures of my daughter, who, at almost seven, will soon become aware of social media and should be afforded some consent. I’m relieved to no longer feel obligated to social media, even though I miss being a part of it sometimes. That is not to say that other people shouldn’t or don’t get anything positive out of social media. The key is to make sure the positive you get out of a platform outweighs the negative impact on you and the world.

There are very simple apps I use daily that truly make my life better - ordering groceries, scheduling workouts, managing my finances, investing, booking travel. Digital products are a huge part of managing my life and information, and I can still fall into a Twitter spiral with the best of them. Again, the key is to regularly ask myself whether the positive I’m getting out of an app outweighs the negative.

For my daughter, we choose what she has access to carefully and then give her some freedom within that subset. Simply limiting her screen time has not reduced her interest in technology at all - in fact, I think it’s made technology even more compelling to her because it seems special and somewhat forbidden. We are slowly trying to increase her awareness within our oversight and to teach her how to make good choices based on what we value. We try to spend lots of time on activities that are not digital, and we let her feel boredom without defaulting to giving her a screen.

I’m mostly striving for balance. There are days I’m on my phone too much, weeks when I realize my newsfeed is making me crazy, weekends my daughter gets more iPad than I would like. For me, the important thing is to make a choice every single day, not just do something out of habit.

Over the past few years, have you seen your clients grow more concerned with understanding the potential negative consequences of their innovations? Why do you think that is?

Honestly, not as much as I would like. But we do feel more comfortable as their partner in bringing issues to light and asking hard questions. And we consider it our responsibility to help our clients envision a successful digital experience that is worthwhile, transparent, and ethical.

It’s easy to underplay implications when a product is in its early stages. We see now why the technology giants have the power they have, but they all started small, with simple ideas that turned into something else entirely. Technology does not evolve in a straight line; it evolves in all directions. What we offer to and learn from even the smallest audience sits at the center of an idea that could go anywhere.

In the end, people want their business to succeed, to secure more funding, to meet their goals, and success is dependent on engagement and growth. It’s our job to show how to achieve high engagement and growth by creating something ethical and valuable that their customers will love.

What can people in innovation roles do to anticipate and prevent possible negative consequences of the technology they're building?

Have the hard conversations. Even experts are taken by surprise at the rate at which technology can evolve. Data collected for reasons that seem entirely above board can be aggregated in ways that are invasive. What has to shift is the default idea that you capture all the data you can and figure out what to do with it later. And honestly, we will still need regulatory help on that front. I’m not making excuses - I’m just saying that when you think about the implications of your product, you have to account for where the technology could evolve and be intentional and transparent about your boundaries before, during, and after.

What I have seen is tradeoffs being made when the original, simpler, and typically more customer-centric monetization strategy doesn’t succeed at the rate the business hoped. That’s when alternative monetization strategies, and, frankly, ideas that usually bring less value to the customer, start getting prioritized. Be wary of decisions that are worth nothing to the customer now but could “pay off” down the road. Be wary of a growth strategy that is too aggressive for your product to gain a meaningful foothold in your customers’ need cycle.

Is there an issue in the tech ethics space that you feel isn't getting enough attention? Why do you think it's important?

The environment in which product creators work, at both the macro and micro levels, affects their perception of ethics, transparency, and responsibility. If you compare the environments of the largest technology companies and start-ups, there are interesting similarities:

  • Large percentage of equity in compensation
  • Extremely competitive atmosphere
  • Huge growth and high turnover
  • Constant reorgs and leadership changes
  • Low diversity of employees
  • High rate of harassment and troubling behavior

This is the environment in which the products we use every single day are created - products that are changing our economy, our attention, our elections, our education, finance, and health systems, and how we communicate and connect. We seem to have decided that this is the only environment in which to grow at the magnitude expected for VC funding and IPOs, for attention and cachet.

I would argue those environments foster a watered-down customer focus, incentivize extreme growth at all costs, silo transparency and ethics, and offer little deterrent to decisions with longer-term negative consequences. They degrade creativity and innovation, and either burn through talented product people or trap them with golden handcuffs. And more to the point, they have created some of the thorniest ethical quandaries of our time.

I consider it a personal mission to lead a technology company that treats our employees, the environment in which we work, our clients, and their customers with care, consideration, and commitment. And I am increasingly hopeful for every conversation, article, conference, and book about the positive force technology can be and how to make that our reality.

In what areas of business, society, or human life in general do you see the greatest potential for digital technology to do good?

Mobile learning and curriculum in developing countries, especially for girls and women who are often left behind by conventional education opportunities, is an area where I think digital technology can make a substantial impact. Digital education in developing countries offers access to localized curriculum as well as guidance and support for teachers and students. It fosters foundational digital skills, human rights awareness, and access to career paths.

Women are overrepresented in job sectors expected to be automated. Without increased digital fluency and without closing the gender gap in education and STEM careers, the world is missing out on trillions in human capital, and women are missing out on their potential to join the workforce, earn an income, and be a part of the innovation cycles changing their communities and the world.

Improving access to the internet, mobile devices, education, and job placement - all areas in which women in the developing world fall woefully behind men - remains challenging. But there are many special companies stepping up to those challenges. Learning Equality is creating offline learning resources, while BRCK is working specifically to address access issues in education in the developing world. And I am particularly excited to call out a couple of female-founded companies, Cell-Ed and Odetta.ai, that are creating platforms specifically focused on access to adult education and work opportunities for women in technology. Lastly, I would encourage readers to learn more about the Center for Humane Technology - an organization committed to the positive force technology can be and working to create advocacy, education, and change.

Let’s build a better future, together.