BY MICHAEL KIBEDI
N/A-A
N/A-A (2023) is a larger-than-life installation of perforated punch cards by Merve Mepa, an artist working at the intersection of material forms, cultural science, and computing culture. Each card is pierced with small holes; the cards are stitched together in a double track and suspended from the ceiling of a gallery space. Mepa describes the algorithmically produced patterns as “reproducing itself according to neighbourhood relations in matrix systems, [becoming] a form of reproduction capable of weaving itself.” Despite the work’s aesthetic and textural appeal, we must not lose sight of the punch card’s utility.
Students of computing history will recognise punch cards as objects containing the instructions required by early calculation engines. In the early twentieth century, “computer” was a job title given to the women who translated requirements into the sequences of code represented by the holes in these cards.
We now work with electronic devices capable of computing instructions of far greater complexity; however, we still rely on binary representation to capture data about the world — a point argued compellingly by James Bridle in Ways of Being. That computer data can generate compelling images or drive complex predictive systems does not change its inherent limitations. Our data are abstractions of reality constructed from vast arrays of binary values: zero or one, true or false, on or off.
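To make that concrete, here is a minimal Python sketch (my own illustration, not drawn from Bridle) showing how even a seemingly rich measurement bottoms out in zeros and ones:

```python
import struct

# A person's height in metres looks like a rich, continuous fact,
# but the machine stores it as a fixed pattern of 64 bits.
height_m = 1.75
bits = "".join(f"{byte:08b}" for byte in struct.pack(">d", height_m))
print(bits)  # begins '0011111111111100...': zeros and ones, nothing more
```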
Fast-forward to the twenty-first century, and our understanding of interdependence in the natural world has advanced to such an extent that we realise the mental models we cling to force binary thinking onto ecosystems where few binaries exist.
Nature is littered with examples of plant and fungal life that defy binary categorisation. So what does it mean to deconstruct the binaries that form our data architectures — binaries often imposed by our worldviews? However rich their descriptive power, our data ultimately lack the properties needed to respond to the ambiguity of the lives they attempt to encapsulate.
There are several ways to interpret the deconstruction of the binary. This is partly a question of considering how we use data to address what Simone Browne terms the “ontological conditions of [our] blackness” (pg. 8) — how we capture the essence of a person’s existence in data. If our existence is reduced to a set of attributes, then a techno-optimist response frames the problem as one of missing detail — increase what we capture to improve the descriptive clarity of the subject. This is an approach that Meredith Broussard neatly trounces:
“When you write the kind of computer programs that slot people into neat categories in order to do data analysis, there is a tension between people's messy, shifting identities in the "real" world that rubs up against the sleek empiricism required to do the math that is under the hood in computers. This is most obvious when it comes to the gender binary and binary representation in computer systems.” — More Than A Glitch: Confronting Race, Gender, and Ability Bias in Tech, pg. 107
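To see the tension Broussard describes at the level of a schema, consider a hypothetical sketch (the field names are invented, written in Python purely for illustration): a closed enumeration leaves no room for identities that shift or fall outside its values.

```python
from dataclasses import dataclass
from enum import Enum

class Gender(Enum):
    # A closed set: anything outside these two values simply
    # cannot be recorded, however real it is in the world.
    FEMALE = 0
    MALE = 1

@dataclass
class Subject:
    # Each field is a hard slot. The schema offers no way to say
    # "unknown", "changing", or "declines to answer".
    gender: Gender
    employed: bool
    citizen: bool

# Whatever does not fit is coerced into a nearby value or dropped
# before the "math under the hood" ever runs.
record = Subject(gender=Gender.FEMALE, employed=True, citizen=False)
```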
Multiplying the volume and granularity of our data is not the answer. Debating whether to capture the essence of personhood in a hundred, a thousand, or a million attributes misses the point. It is the thinking behind these actions — the techno-optimist belief that the messy entanglement of human identity can be reduced to a metaphorical pixelated data landscape — that is so problematic.
It is this wrong-headed thinking that extends the tendrils of smart, predictive decision-making into the administration of life-changing judgements — the veracity of an asylum claim assessed from an applicant’s dialect, the “wrong” sentiment detected in a job applicant’s voice, the likelihood of a marriage application being selected for scrutiny based on nationality. Most often these punitive steps land on the most vulnerable in our societies. Fraud detection within social welfare is a common arena for such systems, with recent deployments wreaking havoc in Denmark and the Netherlands.
For now, we must look past the data and focus on the thinking that instructs its collection. Vendors of smart solutions claim to calculate the probability of future behaviours from a set of defined data attributes — many of them intrinsic or unchangeable. When those attributes prove insufficient to burnish the perception of the system’s intelligence, ever more esoteric data are sourced to fill the gaps. Consider the welfare fraud risk system deployed in Rotterdam in the Netherlands:
“[The welfare fraud risk score algorithm] judges people on many characteristics they cannot control (like gender and ethnicity)… The data fed into the algorithm ranges from invasive (the length of someone’s last romantic relationship) and subjective (someone’s ability to convince and influence others) to banal (how many times someone has emailed the city) and seemingly irrelevant (whether someone plays sports).” — Inside the Suspicion Machine
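What follows is a deliberately toy sketch, with invented features and weights that loosely echo the categories reported in Inside the Suspicion Machine; it is not the Rotterdam algorithm, only an illustration of how a life becomes a weighted sum.

```python
# Invented weights for illustration only; the real system's features,
# weights, and scale differ and were never fully public.
WEIGHTS = {
    "is_female": 0.12,                        # a characteristic one cannot control
    "months_since_last_relationship": -0.01,  # invasive
    "persuasiveness_rating": -0.08,           # a caseworker's subjective judgement
    "emails_to_city": 0.02,                   # banal
    "plays_sports": -0.03,                    # seemingly irrelevant
}

def risk_score(subject: dict) -> float:
    # Every attribute, relevant or not, is folded into a single number
    # used to rank people for fraud investigation.
    return sum(w * float(subject.get(k, 0)) for k, w in WEIGHTS.items())

print(risk_score({"is_female": 1, "emails_to_city": 4}))  # ~0.2
```

The point is not the arithmetic but the premise: that such a sum could stand in for a person.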
The misrepresentation of personhood is not a data quality or data science problem; we ought to treat it as an ontological concern — how are we producing existence in data? We should begin by interrogating the thinking that treats persons as subjects to be fitted into rigid taxonomies. We need to question whether the interiority of our being can (or rather, should) ever be appraised by an algorithm.
Our belief in datafication as a way of producing existence not only signals clumsy ontological capture but also results in the elision of a person’s fullness of being. Patricia Hill Collins (citing Zuleyma Tang Halpin) summarises this emphatically in Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment:
“… each term in the binaries white/black, male/female, reason/emotion, culture/nature, fact/opinion, mind/body, and subject/object gains meaning only in relation to its counterpart” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 19
Collins is here concerned with examining the function of oppressive systems — a context different from ours, but not unrelated. In this light, we should be suspicious of how oppositional binaries originate and of the oppressive power structures they hide.
Oppression functions as an ecosystem. From there, it is a short step to an interest in how human and non-human entities interrelate within environmental ecosystems. These parallel realms are worth pondering: the flora, fauna, and fungi remind us of the sometimes messy, sometimes contradictory ways we co-exist. If birds and fungi can be insurgents within the taxonomies imposed upon them, what does that tell us about the lengths we might go to in resisting the datafication of our own kind?
A surface-level concern about the discriminatory outcomes of predictive or algorithmic decisioning systems may spur us on to advocate for creating more descriptive data to “fix” the problem, but look closer. The histories, ideologies, and power structures that equate existence with a growing array of data attributes — each existing individually as a binary marker, interpreted collectively as a summation of what a datafied subject might think, say, or do in the future — overlook the reality of the fluidity, unknowing, and ontological instability of each subject.
“… oppositional binaries rarely represent different but equal relationships, they are inherently unstable. Tension may be temporarily relieved by subordinating one half of the binary to the other. Thus Whites rule Blacks, men dominate women, reason is thought superior to emotion in ascertaining truth, facts supersede opinion in evaluating knowledge, and subjects rule objects.” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 71
The binaries embedded in our systems, and from which so much of our language derives its descriptive power, are remnants of exclusionary power structures that have determined the boundaries of inclusion and exclusion.
“... the recursive features of our Western imaginary are reliant on the reinstatement of logical "truths" made actionable by systems of categorization” — The Black Technical Object, pg. 16
Ultimately, data attributes sort their subjects into categories — and categorisation is political. The ability to impose a worldview by defining how data and its attributes are partitioned is a demonstration of power: it can signal the acceptance of subjects aligned with a dominant power structure while rendering unknowable those subjects who trouble the foundations of its logics.
Blackness troubles the calculative gaze. Our technological desire to know datafied subjects is spurred by a belief that the future can be known from the data of the past. The resemblance to fortune-telling and divination is not accidental, particularly when the inner workings of some algorithmic decisioning systems are scrutinised. In Blackness, we become dark matter, to borrow Simone Browne’s words, a disruptive presence in an exclusionary technological landscape: “… when dark matter troubles algorithms in this way, it amounts to a refusal of the idea of neutrality when it comes to certain technologies” (pg. 162).
The view from the margins often elicits the greatest clarity. Data feminism’s fourth principle reminds us of the importance of rethinking binaries. Layered over Patricia Hill Collins’s insight, we can go further and consider that the binaries we struggle against disguise oppositional and oppressive structures, artificially simplifying them to uphold a dominant worldview.
Ontological capture should become our concern. This is a phrase I am experimenting with. I think it neatly summarises the danger of a digital dragnet forming as we deploy smarter AI on sociotechnical problems, often with life-changing outcomes, and it questions why we seek to capture the essence of a person’s being in data so that each body is made calculable.
“The call is a broader and more fundamental one that recognises the mutual, unfolding enactments of ordering, classifying, producing and ultimately designing technology. This collapses the us-them, human-machine, inside-outside binaries and allows us to see technology and its design not as a recapitulation of disciplinary tropes or tidy conceptual categories, but as a means of participating in unfolding ways of knowing, being and doing.” — Out There
Thinking past binaries means letting go of our industry’s dogma that societal problems are questions that can only be answered with technological solutions. Thinking past binaries means embracing unknowing and accepting discomfort as a settled state — in contrast to the obsession with capturing the world in data to make its subjects knowable, searchable, and examinable as datafied objects.
IMAGE CREDITS
N/A-A (2023) by Merve Mepa is a sculptural work that asks searching questions on the nature of agency, predestination, and control in computerised environments.
Detail from Numbers and Faces: Multi-Racial/Ethnic Combinations Series 1: Face #11, Martina Crouch (Nigerian Igbo Tribe/White) (2020) by Charles Gaines, exhibited in 2021 at the show “Multiples of Natures, Trees and Faces” (Hauser & Wirth, London); a series that questions the political and cultural ideas that shape our understanding of multi-racial identity and the way we produce difference in data.
Michael Kibedi is a design researcher and writer of First & Fifteenth — a biweekly newsletter featuring his essays on human-computer interaction, conceptual art, and data justice. Michael’s design research aims to inspire audiences to adopt more critical responses to the utopian ideals prevalent in the technology industry today.
Michael has spoken at The Conference in Malmö, UNPARSED in London, UX Camp in Brighton, and Design Thinking Zeal. His writing has also been published by the Design Research Society.
BY MICHAEL KIBEDI
N/A-A
N/A-A (2023) is a larger-than-life installation made of perforated punch cards by Merve Mepa, an artist working at the intersection of material forms, cultural science, and computing culture. Each punch card has small holes, stitched together in a double-track and suspended from the ceiling of a gallery space. Mepa describes the algorithmically produced patterns as “reproducing itself according to neighbourhood relations in matrix systems, [becoming] a form of reproduction capable of weaving itself.” Despite the aesthetic and textural appeal, we must not lose sight of its utility.
Students of computing history will recognise punch cards as objects containing instructions required for early calculation engines. In the early twentieth century, a “computer” was a title given to women who translated requirements into the sequence of code represented by the holes seen on these punch cards.
We work with electronic devices capable of computing instructions of far greater complexity, however, we still rely on binary representation to capture data that exists in the world — a point argued compellingly by James Bridle in Ways of Being. Despite computer data being used to generate compelling images or operate complex predictive systems, it does not change its inherent limitations. Our data are abstractions of reality constructed from vast arrays of binary data: zero or one, true or false, on or off.
Fast forward to the twenty-first century and our understanding of interdependence in the natural world has advanced to such an extent, that we realise that the mental models we cling to forces binary thinking onto ecosystems where few exist.
Nature is littered with examples of plant and fungal life that defy binary categorisation. So what does it mean to deconstruct the binaries that form our data architectures which are often imposed as a result of our worldviews? Despite the richness of their descriptiveness, our data ultimately lacks the properties to respond to the unclarity of the lives they attempt to encapsulate.
There are several ways to interpret the deconstruction of the binary. This is partly a question of considering how we use data to address what Simone Browne terms the “ontological conditions of [our] blackness” (pg. 8) — how we capture the essence of a person’s existence in data. If our existence is reduced to a set of attributes, then a techno-optimist response frames the problem as one of missing detail — increase what we capture to improve the descriptive clarity of the subject. This is an approach that Meredith Broussard neatly trounces:
“When you write the kind of computer programs that slot people into neat categories in order to do data analysis, there is a tension between people's messy, shifting identities in the "real" world that rubs up against the sleek empiricism required to do the math that is under the hood in computers. This is most obvious when it comes to the gender binary and binary representation in computer systems.” — More Than A Glitch: Confronting Race, Gender, and Ability Bias in Tech, pg. 107
Multiplying the volume and granularity of our data is not the answer. Arguing whether we strive to capture the essence of personhood in a hundred, a thousand, or a million attributes misses the point. It is the thinking behind these actions — the techno-optimist belief that the messy entanglement of human identity can be reduced to a metaphorical pixelated data landscape — that is so problematic.
It is this wrong-headed thinking that extends the tendrils of smart, predictive decision-making solutions to assist in the administration of life-changing decisions — the veracity of an asylum claim based on an applicant’s dialect, the “wrong” sentiment detected in a job applicant’s voice, the likelihood of your marriage application being selected for scrutiny based on your nationality — most often these punitive steps land on the most vulnerable in our societies. Fraud detection within social welfare is a common arena for such solutions to be deployed, with recent examples wreaking havoc in Denmark and the Netherlands.
For now, we must look past the data and focus on the thinking that instructs its collection. When the vendors of smart solutions claim to be able to calculate the probability of future behaviours based on a set of defined data attributes — many of which are intrinsic or unchangeable; and when the attributes are insufficient to burnish the perception of the system’s intelligence — then ever more esoteric data are sourced to fill these gaps. In the Netherlands, where a welfare fraud risk system was deployed in Rotterdam:
“[The welfare fraud risk score algorithm] judges people on many characteristics they cannot control (like gender and ethnicity)… The data fed into the algorithm ranges from invasive (the length of someone’s last romantic relationship) and subjective (someone’s ability to convince and influence others) to banal (how many times someone has emailed the city) and seemingly irrelevant (whether someone plays sports).” — Inside the Suspicion Machine
Far from being a data quality or data science problem, we ought to treat the misrepresentation of personhood as an ontological concern — how are we producing existence in data? We should begin by interrogating the thinking that considers personhood as subjects to fit into rigid taxonomies. We need to question if the interiority of our being can (or rather, should) ever be appraised by an algorithm.
Our belief in datafication as a way of producing existence not only signals clumsy ontological capture, but it results in the elision of a person’s fullness of being. Patricia Hill Collins (citing Zuleyma Tang Halpin) summarises this emphatically in Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment that:
“… each term in the binaries white/black, male/female, reason/emotion, culture/nature, fact/opinion, mind/body, and subject/object gains meaning only in relation to its counterpart” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 19
Collins, here, is concerned with examining the function of oppressive systems — a context different but not unrelated. In this light, it necessitates having a suspicion of how oppositional binaries originate, and the oppressive power structures they hide.
Oppression functions as an ecosystem. It is a short step to recognise that taking an interest in how human and non-human entities interrelate within environmental ecosystems is not unrelated. Such parallel realms are crucial for us all to ponder. The flora, fauna, and fungi remind us of the sometimes messy, sometimes contradictory ways we co-exist. If the birds and fungi can be insurgents within the taxonomies imposed upon them, what does that tell us about the lengths we go to resist the datafication of our own kind?
A surface-level concern about the discriminatory outcomes of predictive or algorithmic decisioning systems may spur us on to advocate for creating more descriptive data to “fix” the problem1, but look closer. The histories, ideologies and power structures that equates existence with a growing array of data attributes — existing individually as binary markers, interpreted collectively as a summation of what a datafied subject might think, say, or do in the future — are aims that overlook the reality of the fluidity, unknowing and ontological instability of each subject.
“… oppositional binaries rarely represent different but equal relationships, they are inherently unstable. Tension may be temporarily relieved by subordinating one half of the binary to the other. Thus Whites rule Blacks, men dominate women, reason is thought superior to emotion in ascertaining truth, facts supersede opinion in evaluating knowledge, and subjects rule objects.” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 71
The binaries embedded in our systems, and from which so much of our language derives its descriptive power, are remnants of exclusionary power structures that have determined the boundaries of inclusion and exclusion.
“... the recursive features of our Western imaginary are reliant on the reinstatement of logical "truths" made actionable by systems of categorization” — The Black Technical Object, pg. 16
Ultimately data attributes form their subjects into categories — because categorisation is political. The ability to impose a worldview by defining how data and its attributes are to be partitioned demonstrates the possession of power that can bespeak the acceptance of subjects aligned to a dominant power structure while rendering unknown other subjects that trouble the foundations of these logics.
Blackness troubles the calculative gaze. Our technological desire to know the datafied subjects is spurred by a belief that the future can be known from the data of the past. The resemblance to fortune telling and divination is not accidental, particularly when the inner workings of some algorithmic decisioning systems are scrutinised. In Blackness, we become dark matter, to borrow Simone Browne’s words; a disruptive presence in an exclusionary technological landscape: “… when dark matter troubles algorithms in this way, it amounts to a refusal of the idea of neutrality when it comes to certain technologies” (pg. 162).
The view from the margins often elicits the greatest clarity. Data feminism’s fourth principle informs us of the importance of rethinking binaries. Layered over Patricia Hill Collin’s insight, we can go further to consider the binaries that we struggle against are disguising the oppositional and oppressive structures, artificially simplifying them to uphold a dominant worldview.
Ontological capture should become our concern. This is a phrase I am experimenting with. I think it neatly summarises the concern of a digital dragnet forming as we deploy smarter AI to address sociotechnical concerns, often with life-changing outcomes; and questions why we seek to capture the essence of a person’s being in data so each body is made calculable.
“The call is a broader and more fundamental one that recognises the mutual, unfolding enactments of ordering, classifying, producing and ultimately designing technology. This collapses the us-them, human-machine, inside-outside binaries and allows us to see technology and its design not as a recapitulation of disciplinary tropes or tidy conceptual categories, but as a means of participating in unfolding ways of knowing, being and doing.“ — Out There
Thinking past binaries means letting go of our industry’s dogma that societal problems are questions that can only be answered with technological solutions. Thinking past binaries means embracing the unknowing and accepting discomfort as a settled state — which exists in contrast to the obsession with capturing the world in data to make its subjects knowable, searchable, and examinable as datafied objects.
IMAGE CREDITS
N/A-A(2023) by Merve Mepa is a sculptural work that asks searching questions on the nature of agency, predestination, and control in computerised environments.
Detail from Numbers and Faces: Multi-Racial/Ethnic Combinations Series 1: Face #11, Martina Crouch (Nigerian Igbo Tribe/White) (2020) by Charles Gaines exhibited in 2021 at the show “Multiples of Natures, Trees and Faces” (Hauser & Wirth, London); a series that questions the political and cultural ideas that shape our understanding of multi-racial identity and the way we produce difference in data
N/A-A
N/A-A (2023) is a larger-than-life installation made of perforated punch cards by Merve Mepa, an artist working at the intersection of material forms, cultural science, and computing culture. Each punch card has small holes, stitched together in a double-track and suspended from the ceiling of a gallery space. Mepa describes the algorithmically produced patterns as “reproducing itself according to neighbourhood relations in matrix systems, [becoming] a form of reproduction capable of weaving itself.” Despite the aesthetic and textural appeal, we must not lose sight of its utility.
Students of computing history will recognise punch cards as objects containing instructions required for early calculation engines. In the early twentieth century, a “computer” was a title given to women who translated requirements into the sequence of code represented by the holes seen on these punch cards.
We work with electronic devices capable of computing instructions of far greater complexity, however, we still rely on binary representation to capture data that exists in the world — a point argued compellingly by James Bridle in Ways of Being. Despite computer data being used to generate compelling images or operate complex predictive systems, it does not change its inherent limitations. Our data are abstractions of reality constructed from vast arrays of binary data: zero or one, true or false, on or off.
Fast forward to the twenty-first century and our understanding of interdependence in the natural world has advanced to such an extent, that we realise that the mental models we cling to forces binary thinking onto ecosystems where few exist.
Nature is littered with examples of plant and fungal life that defy binary categorisation. So what does it mean to deconstruct the binaries that form our data architectures which are often imposed as a result of our worldviews? Despite the richness of their descriptiveness, our data ultimately lacks the properties to respond to the unclarity of the lives they attempt to encapsulate.
There are several ways to interpret the deconstruction of the binary. This is partly a question of considering how we use data to address what Simone Browne terms the “ontological conditions of [our] blackness” (pg. 8) — how we capture the essence of a person’s existence in data. If our existence is reduced to a set of attributes, then a techno-optimist response frames the problem as one of missing detail — increase what we capture to improve the descriptive clarity of the subject. This is an approach that Meredith Broussard neatly trounces:
“When you write the kind of computer programs that slot people into neat categories in order to do data analysis, there is a tension between people's messy, shifting identities in the "real" world that rubs up against the sleek empiricism required to do the math that is under the hood in computers. This is most obvious when it comes to the gender binary and binary representation in computer systems.” — More Than A Glitch: Confronting Race, Gender, and Ability Bias in Tech, pg. 107
Multiplying the volume and granularity of our data is not the answer. Arguing whether we strive to capture the essence of personhood in a hundred, a thousand, or a million attributes misses the point. It is the thinking behind these actions — the techno-optimist belief that the messy entanglement of human identity can be reduced to a metaphorical pixelated data landscape — that is so problematic.
It is this wrong-headed thinking that extends the tendrils of smart, predictive decision-making solutions to assist in the administration of life-changing decisions — the veracity of an asylum claim based on an applicant’s dialect, the “wrong” sentiment detected in a job applicant’s voice, the likelihood of your marriage application being selected for scrutiny based on your nationality — most often these punitive steps land on the most vulnerable in our societies. Fraud detection within social welfare is a common arena for such solutions to be deployed, with recent examples wreaking havoc in Denmark and the Netherlands.
For now, we must look past the data and focus on the thinking that instructs its collection. When the vendors of smart solutions claim to be able to calculate the probability of future behaviours based on a set of defined data attributes — many of which are intrinsic or unchangeable; and when the attributes are insufficient to burnish the perception of the system’s intelligence — then ever more esoteric data are sourced to fill these gaps. In the Netherlands, where a welfare fraud risk system was deployed in Rotterdam:
“[The welfare fraud risk score algorithm] judges people on many characteristics they cannot control (like gender and ethnicity)… The data fed into the algorithm ranges from invasive (the length of someone’s last romantic relationship) and subjective (someone’s ability to convince and influence others) to banal (how many times someone has emailed the city) and seemingly irrelevant (whether someone plays sports).” — Inside the Suspicion Machine
Far from being a data quality or data science problem, we ought to treat the misrepresentation of personhood as an ontological concern — how are we producing existence in data? We should begin by interrogating the thinking that considers personhood as subjects to fit into rigid taxonomies. We need to question if the interiority of our being can (or rather, should) ever be appraised by an algorithm.
Our belief in datafication as a way of producing existence not only signals clumsy ontological capture, but it results in the elision of a person’s fullness of being. Patricia Hill Collins (citing Zuleyma Tang Halpin) summarises this emphatically in Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment that:
“… each term in the binaries white/black, male/female, reason/emotion, culture/nature, fact/opinion, mind/body, and subject/object gains meaning only in relation to its counterpart” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 19
Collins, here, is concerned with examining the function of oppressive systems — a context different but not unrelated. In this light, it necessitates having a suspicion of how oppositional binaries originate, and the oppressive power structures they hide.
Oppression functions as an ecosystem. It is a short step to recognise that taking an interest in how human and non-human entities interrelate within environmental ecosystems is not unrelated. Such parallel realms are crucial for us all to ponder. The flora, fauna, and fungi remind us of the sometimes messy, sometimes contradictory ways we co-exist. If the birds and fungi can be insurgents within the taxonomies imposed upon them, what does that tell us about the lengths we go to resist the datafication of our own kind?
A surface-level concern about the discriminatory outcomes of predictive or algorithmic decisioning systems may spur us on to advocate for creating more descriptive data to “fix” the problem1, but look closer. The histories, ideologies and power structures that equates existence with a growing array of data attributes — existing individually as binary markers, interpreted collectively as a summation of what a datafied subject might think, say, or do in the future — are aims that overlook the reality of the fluidity, unknowing and ontological instability of each subject.
“… oppositional binaries rarely represent different but equal relationships, they are inherently unstable. Tension may be temporarily relieved by subordinating one half of the binary to the other. Thus Whites rule Blacks, men dominate women, reason is thought superior to emotion in ascertaining truth, facts supersede opinion in evaluating knowledge, and subjects rule objects.” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 71
The binaries embedded in our systems, and from which so much of our language derives its descriptive power, are remnants of exclusionary power structures that have determined the boundaries of inclusion and exclusion.
“... the recursive features of our Western imaginary are reliant on the reinstatement of logical "truths" made actionable by systems of categorization” — The Black Technical Object, pg. 16
Ultimately data attributes form their subjects into categories — because categorisation is political. The ability to impose a worldview by defining how data and its attributes are to be partitioned demonstrates the possession of power that can bespeak the acceptance of subjects aligned to a dominant power structure while rendering unknown other subjects that trouble the foundations of these logics.
Blackness troubles the calculative gaze. Our technological desire to know the datafied subjects is spurred by a belief that the future can be known from the data of the past. The resemblance to fortune telling and divination is not accidental, particularly when the inner workings of some algorithmic decisioning systems are scrutinised. In Blackness, we become dark matter, to borrow Simone Browne’s words; a disruptive presence in an exclusionary technological landscape: “… when dark matter troubles algorithms in this way, it amounts to a refusal of the idea of neutrality when it comes to certain technologies” (pg. 162).
The view from the margins often elicits the greatest clarity. Data feminism’s fourth principle informs us of the importance of rethinking binaries. Layered over Patricia Hill Collin’s insight, we can go further to consider the binaries that we struggle against are disguising the oppositional and oppressive structures, artificially simplifying them to uphold a dominant worldview.
Ontological capture should become our concern. This is a phrase I am experimenting with. I think it neatly summarises the concern of a digital dragnet forming as we deploy smarter AI to address sociotechnical concerns, often with life-changing outcomes; and questions why we seek to capture the essence of a person’s being in data so each body is made calculable.
“The call is a broader and more fundamental one that recognises the mutual, unfolding enactments of ordering, classifying, producing and ultimately designing technology. This collapses the us-them, human-machine, inside-outside binaries and allows us to see technology and its design not as a recapitulation of disciplinary tropes or tidy conceptual categories, but as a means of participating in unfolding ways of knowing, being and doing.“ — Out There
Thinking past binaries means letting go of our industry’s dogma that societal problems are questions that can only be answered with technological solutions. Thinking past binaries means embracing the unknowing and accepting discomfort as a settled state — which exists in contrast to the obsession with capturing the world in data to make its subjects knowable, searchable, and examinable as datafied objects.
IMAGE CREDITS
N/A-A(2023) by Merve Mepa is a sculptural work that asks searching questions on the nature of agency, predestination, and control in computerised environments.
Detail from Numbers and Faces: Multi-Racial/Ethnic Combinations Series 1: Face #11, Martina Crouch (Nigerian Igbo Tribe/White) (2020) by Charles Gaines exhibited in 2021 at the show “Multiples of Natures, Trees and Faces” (Hauser & Wirth, London); a series that questions the political and cultural ideas that shape our understanding of multi-racial identity and the way we produce difference in data
Michael Kibedi is a design researcher and writer of First & Fifteenth — a biweekly newsletter featuring his essays on human-computer interaction, conceptual art, and data justice. Michael’s design research aims to inspire audiences to adopt more critical responses to the utopian ideals prevalent in the technology industry today.
Michael has spoken at The Conference in Malmö, UNPARSED in London, UX Camp in Brighton, and Design Thinking Zeal. His writing has also been published by the Design Research Society.
BY MICHAEL KIBEDI
N/A-A
N/A-A (2023) is a larger-than-life installation made of perforated punch cards by Merve Mepa, an artist working at the intersection of material forms, cultural science, and computing culture. Each punch card has small holes, stitched together in a double-track and suspended from the ceiling of a gallery space. Mepa describes the algorithmically produced patterns as “reproducing itself according to neighbourhood relations in matrix systems, [becoming] a form of reproduction capable of weaving itself.” Despite the aesthetic and textural appeal, we must not lose sight of its utility.
Students of computing history will recognise punch cards as objects containing instructions required for early calculation engines. In the early twentieth century, a “computer” was a title given to women who translated requirements into the sequence of code represented by the holes seen on these punch cards.
We work with electronic devices capable of computing instructions of far greater complexity, however, we still rely on binary representation to capture data that exists in the world — a point argued compellingly by James Bridle in Ways of Being. Despite computer data being used to generate compelling images or operate complex predictive systems, it does not change its inherent limitations. Our data are abstractions of reality constructed from vast arrays of binary data: zero or one, true or false, on or off.
Fast forward to the twenty-first century and our understanding of interdependence in the natural world has advanced to such an extent, that we realise that the mental models we cling to forces binary thinking onto ecosystems where few exist.
Nature is littered with examples of plant and fungal life that defy binary categorisation. So what does it mean to deconstruct the binaries that form our data architectures which are often imposed as a result of our worldviews? Despite the richness of their descriptiveness, our data ultimately lacks the properties to respond to the unclarity of the lives they attempt to encapsulate.
There are several ways to interpret the deconstruction of the binary. This is partly a question of considering how we use data to address what Simone Browne terms the “ontological conditions of [our] blackness” (pg. 8) — how we capture the essence of a person’s existence in data. If our existence is reduced to a set of attributes, then a techno-optimist response frames the problem as one of missing detail — increase what we capture to improve the descriptive clarity of the subject. This is an approach that Meredith Broussard neatly trounces:
“When you write the kind of computer programs that slot people into neat categories in order to do data analysis, there is a tension between people's messy, shifting identities in the "real" world that rubs up against the sleek empiricism required to do the math that is under the hood in computers. This is most obvious when it comes to the gender binary and binary representation in computer systems.” — More Than A Glitch: Confronting Race, Gender, and Ability Bias in Tech, pg. 107
Multiplying the volume and granularity of our data is not the answer. Arguing whether we strive to capture the essence of personhood in a hundred, a thousand, or a million attributes misses the point. It is the thinking behind these actions — the techno-optimist belief that the messy entanglement of human identity can be reduced to a metaphorical pixelated data landscape — that is so problematic.
It is this wrong-headed thinking that extends the tendrils of smart, predictive decision-making solutions to assist in the administration of life-changing decisions — the veracity of an asylum claim based on an applicant’s dialect, the “wrong” sentiment detected in a job applicant’s voice, the likelihood of your marriage application being selected for scrutiny based on your nationality — most often these punitive steps land on the most vulnerable in our societies. Fraud detection within social welfare is a common arena for such solutions to be deployed, with recent examples wreaking havoc in Denmark and the Netherlands.
For now, we must look past the data and focus on the thinking that instructs its collection. When the vendors of smart solutions claim to be able to calculate the probability of future behaviours based on a set of defined data attributes — many of which are intrinsic or unchangeable; and when the attributes are insufficient to burnish the perception of the system’s intelligence — then ever more esoteric data are sourced to fill these gaps. In the Netherlands, where a welfare fraud risk system was deployed in Rotterdam:
“[The welfare fraud risk score algorithm] judges people on many characteristics they cannot control (like gender and ethnicity)… The data fed into the algorithm ranges from invasive (the length of someone’s last romantic relationship) and subjective (someone’s ability to convince and influence others) to banal (how many times someone has emailed the city) and seemingly irrelevant (whether someone plays sports).” — Inside the Suspicion Machine
Far from being a data quality or data science problem, we ought to treat the misrepresentation of personhood as an ontological concern — how are we producing existence in data? We should begin by interrogating the thinking that considers personhood as subjects to fit into rigid taxonomies. We need to question if the interiority of our being can (or rather, should) ever be appraised by an algorithm.
Our belief in datafication as a way of producing existence not only signals clumsy ontological capture, but it results in the elision of a person’s fullness of being. Patricia Hill Collins (citing Zuleyma Tang Halpin) summarises this emphatically in Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment that:
“… each term in the binaries white/black, male/female, reason/emotion, culture/nature, fact/opinion, mind/body, and subject/object gains meaning only in relation to its counterpart” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 19
Collins, here, is concerned with examining the function of oppressive systems — a context different but not unrelated. In this light, it necessitates having a suspicion of how oppositional binaries originate, and the oppressive power structures they hide.
Oppression functions as an ecosystem. It is a short step to recognise that taking an interest in how human and non-human entities interrelate within environmental ecosystems is not unrelated. Such parallel realms are crucial for us all to ponder. The flora, fauna, and fungi remind us of the sometimes messy, sometimes contradictory ways we co-exist. If the birds and fungi can be insurgents within the taxonomies imposed upon them, what does that tell us about the lengths we go to resist the datafication of our own kind?
A surface-level concern about the discriminatory outcomes of predictive or algorithmic decisioning systems may spur us on to advocate for creating more descriptive data to “fix” the problem1, but look closer. The histories, ideologies and power structures that equates existence with a growing array of data attributes — existing individually as binary markers, interpreted collectively as a summation of what a datafied subject might think, say, or do in the future — are aims that overlook the reality of the fluidity, unknowing and ontological instability of each subject.
“… oppositional binaries rarely represent different but equal relationships, they are inherently unstable. Tension may be temporarily relieved by subordinating one half of the binary to the other. Thus Whites rule Blacks, men dominate women, reason is thought superior to emotion in ascertaining truth, facts supersede opinion in evaluating knowledge, and subjects rule objects.” — Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment, pg. 71
The binaries embedded in our systems, and from which so much of our language derives its descriptive power, are remnants of exclusionary power structures that have determined the boundaries of inclusion and exclusion.
“... the recursive features of our Western imaginary are reliant on the reinstatement of logical "truths" made actionable by systems of categorization” — The Black Technical Object, pg. 16
Ultimately data attributes form their subjects into categories — because categorisation is political. The ability to impose a worldview by defining how data and its attributes are to be partitioned demonstrates the possession of power that can bespeak the acceptance of subjects aligned to a dominant power structure while rendering unknown other subjects that trouble the foundations of these logics.
Blackness troubles the calculative gaze. Our technological desire to know the datafied subjects is spurred by a belief that the future can be known from the data of the past. The resemblance to fortune telling and divination is not accidental, particularly when the inner workings of some algorithmic decisioning systems are scrutinised. In Blackness, we become dark matter, to borrow Simone Browne’s words; a disruptive presence in an exclusionary technological landscape: “… when dark matter troubles algorithms in this way, it amounts to a refusal of the idea of neutrality when it comes to certain technologies” (pg. 162).
The view from the margins often elicits the greatest clarity. Data feminism’s fourth principle informs us of the importance of rethinking binaries. Layered over Patricia Hill Collin’s insight, we can go further to consider the binaries that we struggle against are disguising the oppositional and oppressive structures, artificially simplifying them to uphold a dominant worldview.
Ontological capture should become our concern. It is a phrase I am experimenting with, and I think it neatly names two things: the digital dragnet forming as we deploy ever smarter AI against sociotechnical problems, often with life-changing outcomes; and the question of why we seek to capture the essence of a person’s being in data so that each body is made calculable.
“The call is a broader and more fundamental one that recognises the mutual, unfolding enactments of ordering, classifying, producing and ultimately designing technology. This collapses the us-them, human-machine, inside-outside binaries and allows us to see technology and its design not as a recapitulation of disciplinary tropes or tidy conceptual categories, but as a means of participating in unfolding ways of knowing, being and doing.” — Out There
Thinking past binaries means letting go of our industry’s dogma that societal problems are questions that can only be answered with technological solutions. Thinking past binaries means embracing the unknowing and accepting discomfort as a settled state, in contrast to the obsession with capturing the world in data to make its subjects knowable, searchable, and examinable as datafied objects.
IMAGE CREDITS
N/A-A (2023) by Merve Mepa is a sculptural work that asks searching questions on the nature of agency, predestination, and control in computerised environments.
Detail from Numbers and Faces: Multi-Racial/Ethnic Combinations Series 1: Face #11, Martina Crouch (Nigerian Igbo Tribe/White) (2020) by Charles Gaines, exhibited in 2021 at the show “Multiples of Natures, Trees and Faces” (Hauser & Wirth, London); a series that questions the political and cultural ideas that shape our understanding of multi-racial identity and the way we produce difference in data.
Michael Kibedi is a design researcher and writer of First & Fifteenth — a biweekly newsletter featuring his essays on human-computer interaction, conceptual art, and data justice. Michael’s design research aims to inspire audiences to adopt more critical responses to the utopian ideals prevalent in the technology industry today.
Michael has spoken at The Conference in Malmö, UNPARSED in London, UX Camp in Brighton, and Design Thinking Zeal. His writing has also been published by the Design Research Society.