
Poirot's Deduction

From inappropriate remarks and abuse
Lee Lu-da was praised as a good conversation partner, as if she were a real person. Nevertheless, humans were not ready for Lee Lu-da. The lack of preparation showed in hateful and discriminatory expressions and in the controversy over sexual abuse by some users. Lee Lu-da adopted a deep learning method based on unsupervised learning, in which machines learn patterns from large amounts of data on their own and find optimal answers. The big data supporting Lee Lu-da came from a file of 10 billion KakaoTalk conversations, originally collected for a love advice app called Science of Love, a subsidiary of ScatterLab. Above all, ScatterLab, its developer, has been criticized for allegedly using KakaoTalk data from Science of Love users indiscriminately, without proper consent or anonymization.

▲Science of Love was the database of Lee Lu-da. ©ScatterLab Homepage

From human factors
① Developers
Artificial intelligence algorithms are constructed, operated, and transformed by developers. In other words, developers' personal assumptions and biases contribute to the bias of artificial intelligence. Oh Yo-han, a social researcher, said, "Even without explicitly injecting developer bias, personal opinions can flow into algorithms unconsciously. Currently, major U.S. companies are aware of this problem and consider diversity of race and gender when hiring developers." In fact, unfair facial recognition technology has intensified discrimination against people of color and women, and women have struggled with VR headsets designed by male engineers for male consumers. Human resources from various backgrounds must provide a variety of perspectives, both to compensate for defects in products and services and to keep the values of fairness and justice from being undermined. Reflecting this reality, Google CEO Sundar Pichai announced plans to more than double the number of Black and Black mixed-race employees by 2025.
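The mechanism behind such unfair systems is usually statistical rather than deliberate: a model trained mostly on data from one group fits that group and fails the rest. The sketch below is not ScatterLab's or any real pipeline; the groups, labels, and counts are invented purely to show the effect:

```python
from collections import Counter

# Invented toy data: (group, correct_answer). Group "a" supplies
# 90% of the training examples, mirroring a dataset collected
# mostly from one demographic.
train = [("a", "hit")] * 90 + [("b", "miss")] * 10

# A deliberately crude "model": always predict the overall
# majority answer, ignoring the group entirely.
majority = Counter(answer for _, answer in train).most_common(1)[0][0]

def accuracy(group):
    # Fraction of this group's examples the model gets right.
    rows = [answer for g, answer in train if g == group]
    return sum(answer == majority for answer in rows) / len(rows)

print(majority)        # "hit", learned from the skewed data
print(accuracy("a"))   # 1.0 for the over-represented group
print(accuracy("b"))   # 0.0 for the under-represented group
```

No developer malice appears anywhere in this code, yet the under-represented group gets every answer wrong: the bias rode in on the data.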
Then it is necessary to look at the developers of ScatterLab. Lee Lu-da was a collaboration between engineers and planners. Engineers design and actually implement the machine learning algorithms, then identify and evaluate the results they produce. Planners, also called AI designers, can be seen as directing the AI's persona, envisioning which model to build and in which direction to take it. In particular, the planners targeted women in their teens to 30s and aimed for the feeling of chatting with a friend and receiving sympathy through conversation. Both professions had a significant impact on the birth of Lee Lu-da, and the engineers on ScatterLab's website were almost all men.

▲Sundar Pichai, the CEO of Google, raises his voice against racism. ©The American Bazaar

From nonhuman factors
The combination of big data and algorithms can produce discriminatory results. According to the paper "Artificial Intelligence Algorithm and Discrimination," written by Hong Sung-wook, a professor at Seoul National University, personal information such as gender and race can easily be estimated by combining and comparing data or by asking a few pointed questions. Through intentional questions, companies can obtain information used to deselect women or applicants from certain regions. Regardless of intention, algorithms may also produce discriminatory decisions. Oh Yo-han, a social researcher of science and technology at Rensselaer Polytechnic Institute, said, "For example, if they create an algorithm that scores the length of employment at previous jobs, it will be disadvantageous to women, who have relatively shorter employment periods due to marriage or parental leave. Since tenure data is not suitable as a factor for predicting job performance, neutral results cannot naturally be produced. This is where diverse variables should be considered, independent of the algorithm's intentions." In addition, he stressed, "Humans have social preferences regarding age, gender, and occupation, and discriminatory views to match, which makes a chain of biases: human to data, data to algorithms."
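Oh Yo-han's tenure example fits in a few lines of code. The applicant pool below is entirely invented; the point is only that ranking on a feature correlated with gender, here tenure, can exclude women even when the feature does not track performance:

```python
# Invented toy applicant pool. Tenure (years at the previous job)
# is shorter for the women here, echoing career breaks; the
# performance column is what the employer actually cares about.
applicants = [
    {"name": "A", "gender": "F", "tenure": 2.0, "performance": 0.9},
    {"name": "B", "gender": "F", "tenure": 2.5, "performance": 0.8},
    {"name": "C", "gender": "M", "tenure": 5.0, "performance": 0.6},
    {"name": "D", "gender": "M", "tenure": 4.5, "performance": 0.7},
]

def select_by_tenure(pool, k=2):
    # The "neutral" rule: hire the k applicants with the longest
    # tenure, never looking at gender directly.
    return sorted(pool, key=lambda a: a["tenure"], reverse=True)[:k]

hired = select_by_tenure(applicants)
print([a["gender"] for a in hired])       # ['M', 'M']
print([a["performance"] for a in hired])  # [0.6, 0.7]
```

The rule never reads the gender field, yet it hires only men, and the weaker performers at that, because tenure stands in for gender, exactly the chain of biases the quote describes.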
30 | www.theargus.org
