Disclaimer: Posts are solely the views of the author and do not represent the views of Brandeis University or the Institute for Economic and Racial Equity.
“The main question for our technological future becomes, ‘Whose technology are we using and for what cause?’”
Technology’s impact stood at my periphery: the joys of late-night YouTube binges and the unfettered curiosity stoked by social media sites like Facebook, which lent my generation cultural artifacts that will last for ages, never reached me. My separation from early internet culture was not due to personal inclination, but rather a consequence of distance.
I grew up as the child of civil servants. During the most formative years of early-aughts internet abandon, I lived on one of the most heavily guarded and militarized naval bases in American history: Guantanamo Bay, Cuba. While a certain level of privilege and sacrifice accompanied my early teen years as a Navy brat, those years also lent me a specific lens through which I presently interact with and perceive the exponential growth of technology. To send a single email back then required a visit to the lone internet café on site, where the hours of operation were minimal and internet availability scarcer still.
I write about my upbringing around technology to note that my view of technology’s role in social behavior and political movements is markedly different from that of other members of my generation. While it was not financial adversity that led to my relatively technology-free upbringing, but simply the reality of growing up a tangential member of the intelligence community, I nevertheless acquired a different perspective on the role of technology in modern society.
The Problem With Our Technological Development Pipeline
Since the birth of Web 1.0, the first generation of the world wide web, extensive progress has occurred in record time. Discussions of Mark Zuckerberg’s “metaverse” are actualizing the Y2K cyberpunk dystopia of Neal Stephenson’s “Snow Crash,” all while cypherpunks forge our new digital crypto economy as imagined by Jude Milhon.
As we careen toward a new generation of the web, Web 3.0, digital rights and privacy are being eroded in the name of technological progress. With the launch of Web 2.0, the current iteration of our internet, technology became ubiquitous, gamified, and easier to use. Yet that very ubiquity, gamified user experience (UX), and intuitive user interface (UI) design have widened the gap between where we are and a progressive future for technology.
The challenge of creating equitable forms of technology is multi-pronged: (i) data collection, surveillance, and privacy are viewed as an inevitable trade-off for the use of technology, (ii) a majority of American adults 18 years and older do not understand what happens to their collected data, and (iii) a majority of American adults 18 years and older lack an understanding of basic cybersecurity.
The main question for our technological future becomes, “Whose technology are we using and for what cause?”
Modern data collection starts with an application programming interface (API). An API acts as an intermediary through which third-party developers or researchers can request specific values from an application. APIs are commonly used as data scraping tools to obtain information such as user behavior and location data.
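To make this concrete, the sketch below shows how a third-party developer might pull behavioral and location values out of an API response. The payload and field names are hypothetical, invented for illustration; they do not come from any real platform’s API.

```python
import json

# Hypothetical JSON payload a social-media API might return for one user.
# Every field name here is illustrative, not taken from a real API.
api_response = json.dumps({
    "user_id": "u_1024",
    "last_login": "2022-01-15T22:41:09Z",
    "location": {"lat": 42.3656, "lon": -71.2592},
    "engagement": {"likes": 318, "shares": 42, "session_minutes": 97},
})

def extract_profile(raw: str) -> dict:
    """Pull out the behavioral and location values a data broker might want."""
    record = json.loads(raw)
    return {
        "user": record["user_id"],
        "coords": (record["location"]["lat"], record["location"]["lon"]),
        "minutes_per_session": record["engagement"]["session_minutes"],
    }

profile = extract_profile(api_response)
print(profile)
```

Note how little work the extraction takes: once an API exposes a field, anyone with access can harvest it at scale, which is why what happens to the data afterward matters so much.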
Data collection is often obfuscated by gamified UX, where notifications entice end-users to interact with the application more. The desire to generate likeable content online provides an almost never-ending stream of data. Where the conversation on data collection and technological progress splinters is over how that data is used.
Once collected, data is usually sold to various companies, used to enhance UX, or, in rare cases, used internally for research and not sold. The problem lies in the question of whether technology companies should build more attractive applications to bring in more capital, or build more private, functional, and secure ones. Routinely, the former is chosen.
Racial Equity Within the Tech Industry
“It is no surprise that the tech industry has a race problem.”
With its heavy emphasis on creating more capital and efficiencies in modern life, technology fails to deliver truly accessible equity. In addition to favoring capital over privacy, the technologists who develop, test, run, and deploy these technologies are homogeneous. It is no surprise that the tech industry has a race problem. For example, African Americans make up 7.4% of the high-tech workforce, compared to 14.4% of the workforce in private industries outside of high-tech.
According to a 2014 Equal Employment Opportunity Commission (EEOC) report, the high-tech sector employs a larger share of whites than Asian Americans, African Americans, and Hispanics, and more men than women. In addition to the smaller share of underrepresented minorities (URMs) working for tech companies, little data exist on the retention, promotion, and inclusion of individuals with disabilities, LGBTQIA+ individuals, and mixed-residency individuals. Many objections point to the reluctance of URMs to join STEM fields. However, this argument fails to address the institutional barriers and discrimination that diminish the appeal of working in an environment hostile to a URM’s very being.
“Technology will never be neutral because it is informed by an individual’s lived experience.”
The lack of data is compounded by how few voices we hear from the URMs who do work in tech. Technology will never be neutral because it is informed by an individual’s lived experience. Most of our technological systems are a direct reflection of the individuals who construct our new technologies. This is not merely an observation, but a real facet of working in technology. My own lived experience informs my research on zero-knowledge proof techniques and other privacy-enhancing technologies (PETs) because I believe in a distributed, private networking future.
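For readers unfamiliar with PETs, the sketch below walks through a toy Schnorr identification protocol, one of the simplest zero-knowledge proofs: a prover convinces a verifier that it knows a secret exponent without ever revealing it. The group parameters here are deliberately tiny and purely illustrative; real deployments use large, standardized groups.

```python
import secrets

# Public parameters: g generates a subgroup of prime order q = 11 in Z_23*.
# These values are toy-sized for readability, not cryptographic strength.
p, q, g = 23, 11, 2

x = secrets.randbelow(q)   # prover's secret exponent
y = pow(g, x, p)           # public key: y = g^x mod p

# 1. Commit: prover picks a fresh random nonce r and sends t = g^r.
r = secrets.randbelow(q)
t = pow(g, r, p)

# 2. Challenge: verifier replies with a random challenge c.
c = secrets.randbelow(q)

# 3. Response: prover sends s = r + c*x (mod q); s alone reveals
#    nothing about x because r masks it.
s = (r + c * x) % q

# 4. Verify: the check g^s == t * y^c (mod p) passes exactly when the
#    prover knew x, yet the transcript (t, c, s) leaks no information.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```

The point of the example is the asymmetry it creates: verification is possible without disclosure, which is the design philosophy behind the private, distributed systems described above.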
If we are to have distributive justice within technology, we need to build a truly distributed and equitable network of individuals. Achieving this relies not solely on the innovation of secure and private products, but also on the sustained retention and promotion of “othered” bodies. Within the technology sector, distributive justice can look like pay equity that brings URMs’ salaries in line with those of their white male peers, parental leave policies that grant new parents six months of paid time off, and termination or probation for employees who discriminate against or harass others. It will take work to create the systems we wish to see, but it is the right type of innovation needed for a more secure, private, and equitable future.