Will the future of work be ethical? Future leader perspectives
In June, TechCrunch Ethicist in Residence Greg M. Epstein attended EmTech Next, a conference organized by the MIT Technology Review. The conference, which took place at MIT’s famous Media Lab, examined how AI and robotics are changing the future of work.
Greg’s essay, “Will the Future of Work Be Ethical?”, reflects on his experiences at the conference, which produced what he calls “a religious crisis, despite the fact that I am not just a confirmed atheist but a professional one as well.” In it, Greg explores themes of inequality, inclusion and what it means to work in technology ethically within a capitalist system and market economy.
Accompanying the story for Extra Crunch is a series of in-depth interviews Greg conducted around the conference, with scholars, journalists, founders and attendees.
Below he speaks to two conference attendees who had crucial insights to share. Meili Gupta is a high school senior at Phillips Exeter Academy, an elite boarding school in New Hampshire; Gupta attended the EmTech Next conference with her mother and has attended with family in previous years as well. Her voice and thoughts on privilege and inequality in education and technology are featured prominently in Greg’s essay. Walter Erike is a 31-year-old independent consultant and SAP Implementation Senior Manager from Philadelphia. Between conference sessions, he and Greg talked about diversity and inclusion at tech conferences and beyond.
Greg Epstein: How did you come to be at EmTech Next?
Meili Gupta: I am a rising high school senior at Phillips Exeter Academy; I’m one of the managing editors for my school’s science magazine called Matter Magazine.
I [also] attended the conference last year. My parents have come to these conferences before, and that gave me an opportunity to come. I am particularly interested in the MIT Technology Review because I’ve grown up reading it.
You are the Managing Editor of Matter, a magazine about STEM at your high school. What subjects that Matter covers are most interesting to you?
This year we published two issues. The first featured a lot of interviews with top AI professors, like Professor Fei-Fei Li at Stanford. We did a review for her and an interview with Professor Olga Russakovsky at Princeton. That was an AI special issue, and being at this conference you hear about how AI will transform industries.
The second issue coincided with Phillips Exeter Global Climate Action Day. We focused both on environmentalism clubs at Exeter and environmentalism efforts worldwide. I think Matter, as the only STEM magazine on campus, has a responsibility to do that.
AI and climate: in a sense, you’ve already dealt with this new field people are calling the ethics of technology. When you hear that term, what comes to mind?
As a consumer of a lot of technology and as someone of the generation who has grown up with a phone in my hand, I’m aware my data is all over the internet. I’ve had conversations [with friends] about personal privacy and if I look around the classroom, most people have covers for the cameras on their computers. This generation is already aware [of] ethics whenever you’re talking about computing and the use of computers.
About AI specifically, as someone who’s interested in the field and has been privileged to be able to take courses and do research projects about that, I’m hearing a lot about ethics with algorithms, whether that’s fake news or bias or about applying algorithms for social good.
What are your biggest concerns about AI? What do you think needs to be addressed in order for us to feel more comfortable as a society with increased use of AI?
That’s not an easy answer; it’s something our society is going to be grappling with for years. From what I’ve learned at this conference, from what I’ve read and tried to understand, it’s a multidimensional solution. You’re going to need computer programmers to learn the technical skills to make their algorithms less biased. You’re going to need companies to hire those people and say, “This is our goal; we want to create an algorithm that’s fair and can do good.” You’re going to need the general society to ask for that standard. That’s my generation’s job, too. WikiLeaks, a couple of years ago, sparked the conversation about personal privacy and I think there’s going to be more sparks.
It seems like your high school is doing some interesting work in terms of incorporating both STEM and a deeper, more creative focus than usual on ethics and exploring the meaning of life. How would you say Exeter in particular is trying to combine these issues?
I’ll give a couple of examples of my experience with that in my time at Exeter, and I’m very privileged to go to a school that has these opportunities and offerings for its students.
Don’t worry, that’s in my next question.
Absolutely. With the computer science curriculum, starting in my ninth grade year they offered a computer science 590 about [introduction to] artificial intelligence. In the fall, another 590 course was about self-driving cars, and you saw the intersection between us working in our robotics lab and learning about computer vision algorithms. This past semester, a couple of students, myself included, helped to set up a 999: an independent course which really dove deep into machine learning algorithms. In the fall, there’s another 590 I’ll be taking called social innovation through software engineering, which is specifically designed for each student to pick a local project and to apply software, coding or AI to a social good project.
I’ve spent 15 years working at Harvard and MIT. I’ve worked around a lot of smart and privileged people and I’ve supported them. I’m going to ask you a question about Exeter and about your experience as a privileged high school student who is getting a great education, but I don’t mean it from a perspective of me versus you.
Of course you’re not.
I’m trying to figure this out for myself as well. We live in a world where we’re becoming more prepared to talk about issues of fairness and justice. Yet by even just providing these extraordinary educational experiences to people like you and me and my students or whomever, we’re preparing some people for that world better than others. How do you feel about being so well prepared for this sort of world to come that it can actually be… I guess my question is, how do you relate to the idea that even the kinds of educational experiences that we’re talking about are themselves deepening the divide between haves and have nots?
I completely agree that the issue between haves and have nots needs to be talked about more, because inequality between the upper and the lower classes is growing every year. This morning, the talk by Mr. Isbell from Georgia Tech was really inspiring. For example, at Phillips Exeter, we have a social service club called ESA which houses more than 70 different social service clubs. One I’m involved with, junior computer programming, teaches programming to local middle school students. That’s the type of thing, at an individual level and smaller scale, that people can do to try to help out those who have not been privileged with opportunities to learn and get ahead with those skills.
What Mr. Isbell was talking about this morning was at a university level, and also about tying in corporations to bridge that divide. I don’t think the issue itself should necessarily scare us away from pushing forward to the frontier because of, say, the possibility that in five years everybody who does not have a computer science education won’t have a job.
Today we had that debate about robots and people’s jobs and robot taxes. That’s a very good debate to have, but it sometimes feeds a little bit into the AI hype, and I think it may be a disgrace to society to try to pull back technology, which has been shown to have the power to save lives. It can be two transformations happening at the same time: one that’s trying to bridge the inequality, which is going to come in a lot of different and complicated solutions at multiple levels, and a second that allows for a transformation in technology and AI.
What are you hoping to get out of this conference for yourself, as a student, as a journalist, or as somebody who’s going into the industry?
The theme for this conference is the future of the workforce. I’m a student. That means I’m going to be the future of the workforce. I was hoping to learn some insight about what I may want to study in college. After that, what type of jobs do I want to pursue that are going to exist and be in demand and really interesting, and have an impact on other people? Also, as a student who is particularly interested in majoring in computer science and artificial intelligence, I was hoping to learn about possible research projects that I could pursue in the fall with this 590 course.
Right now, I’m working on a research project with a professor at the University of Maryland about eliminating bias in machine learning algorithms. What type of dataset do I want to apply that project to? Where is the need or the attention for correcting bias in AI algorithms?
As a journalist, I would like to write a review summarizing what I’ve learned so other [Exeter students] can learn a little too.
What would be your biggest critique of the conference? What could be improved?