Artificial Intelligence

Datafication Defined:
"digital information that is collected, organized, and translated into new uses that can be valued or monetized…” (Wilkerson & O’Sullivan, 2023, p.10)
Principles of Data Justice:
- Respect for Human Rights
- Freedom of Expression
- Transparency
- Accountability
- Security
- Inclusivity
- Privacy
- Respect for Local Cultures
While reviewing information regarding data justice, citizenship, and literacy, two important topics grabbed my attention:
​
Convenience vs. Datafication
We now live in an age where significant convenience is offered through datafication. Entertainment, commerce, education, and now health and wellness can all be experienced virtually and at home, but this comes at the great risk of having our personal data compromised.
​
As it relates to my interest in infant and maternal mortality, the availability and use of telehealth offers direct solutions for at-risk families who face transportation and scheduling barriers. According to the Centers for Disease Control and Prevention and the Indiana Department of Health, these barriers directly contribute to perinatal risk, the leading cause of our unusually high and persistent infant and maternal mortality rates (IDHDFRP, 2022; IDHDIMH, 2022).
​
However, that tradeoff of convenience for datafication exposes us to unforeseen and unforeseeable risks as we put more and more information about ourselves into a digital world that we cannot control.
​
Algorithmic Governance, Predictive Policing, and Social Credit Scores
​
The online apps and programs that we use to make our lives easier are creating both metaphorical and literal profiles of our preferences, movements, and online experiences. This raises an ethical question: should such a profile be considered an accurate reflection of our moral character?
This idea of predictive policing is a common theme in science fiction, but the concept is creeping into reality in the form of social credit scores. This is a social concept being used by parts of the Chinese government to manage its citizens, some say voluntarily and some say not. The idea is that observed social behavior, largely quantified through datafication and deemed desirable or undesirable, can raise or lower your personal score, which can then affect your access to travel, utilities, and education. Examples of undesirable behavior include excessive internet use, social media posting, playing violent video games, and smoking in non-smoking areas.
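To make that mechanism concrete, here is a purely hypothetical toy sketch in Python of how such a quantified scoring scheme might work. The behaviors, point values, and thresholds are invented for illustration and are not drawn from any actual system; the point is simply that a person must decide which behaviors count, by how much, and where the cutoffs sit.

```python
# Hypothetical illustration only: these behaviors, weights, and thresholds
# are invented to show how a quantified "social score" could gate access
# to services, not how any real system works.

# Designer-chosen weights: each observed behavior adds or subtracts points.
BEHAVIOR_POINTS = {
    "volunteering": +5,
    "paying_bills_on_time": +3,
    "excessive_internet_use": -2,
    "smoking_in_nonsmoking_area": -5,
}

# Designer-chosen cutoffs that gate access to services.
ACCESS_THRESHOLDS = {
    "air_travel": 90,
    "university_admission": 80,
    "high_speed_internet": 70,
}

def update_score(score: int, observed_behaviors: list[str]) -> int:
    """Adjust a starting score based on a list of observed behaviors."""
    for behavior in observed_behaviors:
        score += BEHAVIOR_POINTS.get(behavior, 0)  # unlisted behaviors ignored
    return score

def allowed_services(score: int) -> list[str]:
    """Return the services this score would unlock under the toy thresholds."""
    return [service for service, minimum in ACCESS_THRESHOLDS.items()
            if score >= minimum]

if __name__ == "__main__":
    score = update_score(85, ["paying_bills_on_time",
                              "excessive_internet_use",
                              "smoking_in_nonsmoking_area"])
    print(score)                    # 81
    print(allowed_services(score))  # ['university_admission', 'high_speed_internet']
```

Every number in this sketch is a human judgment, which is exactly where the implicit bias discussed below enters the picture.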
Two articles (Brussee, 2022; Canales & Mok, 2022) that offer different perspectives on China’s use of social credit scores can be found by clicking on the titles here:
China’s social credit score – untangling myth from reality
China’s “Social Credit” system ranks citizens and punishes them with throttled internet speeds and flight bans if the Communist Party deems them untrustworthy
I find it very interesting to compare and contrast the information these two articles provide and to reflect on the “voice” and context in which each is presented. There are a number of articles on this topic, so if it interests you, I encourage you to review more than just the two examples I have provided in order to better understand its full scope.
Here in the United States, similar behavior is judged in the court of public opinion, commonly through calls to shame or boycott businesses or individuals because of their public behavior (Ronson, 2022).
However, we should be very mindful of governmental and authoritarian organizations making decisions or classifications about us based on datafication fed into an algorithm, because algorithms are subject to, and greatly impacted by, the implicit biases of the people who create them.
In summary, the datafication of ourselves offers tremendous convenience and advantage, but it comes at a cost to our security and can, in some instances, be used against us. In the interest of social justice and equity, we must be vigilant about how our society associates our digital citizenship with our actual selves, so that we preserve the principles of data justice and avoid or break cycles of oppression.
References
Brussee, V. (2022, February 11). China’s social credit score – untangling myth from reality. Merics. https://merics.org/en/comment/chinas-social-credit-score-untangling-myth-reality

Canales, K., & Mok, A. (2022, November 28). China’s “Social Credit” system ranks citizens and punishes them with throttled internet speeds and flight bans if the Communist Party deems them untrustworthy. Business Insider. https://www.businessinsider.com/china-social-credit-system-punishments-and-rewards-explained-2018-4

Indiana Department of Health Division of Fatality Review and Prevention (IDHDFRP). (2022). Indiana Maternal Mortality Review Committee 2022 Annual Report. https://www.in.gov/health/frp/files/MMR-Report-September-2022.pdf

Indiana Department of Health Division of Infant & Maternal Health (IDHDIMH). (2022). Perinatal risks for infant deaths 2016-2020. https://www.in.gov/health/mch/files/Final-Perinatal-Risks-Fact-Sheet-2016-2020-.pdf

Ronson, J. (2022). So you’ve been publicly shamed. Picador.

Wilkerson, D. A., & O’Sullivan, L. (2023). Social work in an online world. NASW Press.