LinkedIn conducted social experiments with twenty million users over five years

By The New York Times (republished by Infobae).

LinkedIn conducted experiments with more than 20 million users over five years that, while designed to improve the way the platform works for members, may have hurt some people’s income, a new study says.

In experiments conducted globally from 2015 to 2019, LinkedIn randomly varied the ratio of weak to strong contacts suggested by its “People You May Know” algorithm, the company’s automated system that recommends new connections to its users. The findings were detailed in a study published this month in the journal Science, co-authored by researchers from LinkedIn, the Massachusetts Institute of Technology (MIT), Stanford University and Harvard Business School.

LinkedIn’s algorithmic experiments could surprise millions of people because the company didn’t let users know the tests were happening.

Big tech companies like LinkedIn, the world’s largest professional network, routinely run large-scale experiments, testing different versions of app features, web designs and algorithms on different groups of people. The long-standing practice, known as A/B testing, is meant to improve the user experience and keep users engaged, which helps companies make money through premium membership fees or advertising. Users often have no idea that companies are running the tests on them.

However, the changes made by LinkedIn show how such tweaks to widely used algorithms can become social engineering experiments with potentially life-altering consequences. Experts who study the societal impact of computing said that conducting long, large-scale experiments on people that could affect their job prospects, in ways invisible to them, raised questions about industry transparency and research oversight.

Michael Zimmer, associate professor of computer science and director of the Center for Data, Ethics and Society at Marquette University, commented: “The findings suggest that some users had better access to job opportunities, or a significant difference in access to job opportunities. Those are the long-term consequences that need to be weighed when considering the ethics of engaging in this kind of big data research.”

The study in Science examined an influential theory in sociology called “the strength of weak ties,” which holds that people are more likely to gain employment and other opportunities through more distant acquaintances than through close friends.

Researchers analyzed how changes to LinkedIn’s algorithm had affected users’ job mobility. They found that relatively weak social ties proved twice as effective at securing a job on LinkedIn as stronger social ties.

In a statement, LinkedIn said it acted “in accordance” with the company’s Terms of Service and Privacy Policy and user preferences during the study. The privacy policy states that LinkedIn uses members’ personal information for research purposes. The statement added that the company used the latest and “non-invasive” sociological techniques to answer key research questions “without experimenting on members.”

LinkedIn, which is owned by Microsoft, did not directly respond to a question about how the company had weighed the potential long-term impact of its experiments on users’ employment and economic status. However, the company said the research did not give some users an unfair advantage over others.

Karthik Rajkumar, an applied research scientist at LinkedIn who co-authored the study, explained that the goal of the research was “to help people at scale. No one was put at a disadvantage when looking for a job.”

Sinan Aral, a professor of management and data science at MIT and the study’s lead author, said the LinkedIn experiments are an initiative to ensure users have equal access to job opportunities.

Aral said: “Running an experiment on twenty million people and then implementing a better algorithm that improves everyone’s employment prospects based on what is learned is what they are trying to do, not to grant social mobility to some people and not others.” (Aral has done data analysis for The New York Times and received a Microsoft research grant in 2010.)

Experiments on the users of large Internet companies have a patchy track record. Eight years ago, a Facebook study was published describing how the social network had secretly manipulated which posts appeared in users’ news feeds in order to analyze the spread of negative and positive emotions on its platform. The week-long experiment, conducted on 689,003 users, prompted an immediate backlash.

LinkedIn’s networking experiments were different in intent and scope. They were part of the company’s ongoing effort to improve the relevance of its “People You May Know” algorithm, which suggests new connections to members.

The algorithm analyzes data such as members’ employment history, their job titles, and their connections to other users. It then attempts to measure the likelihood that a LinkedIn member will send a suggested new connection a friend invitation, and the likelihood that that new connection will accept the invitation.

For the experiments, LinkedIn adjusted its algorithm to randomly vary the prevalence of strong and weak ties that the system recommended. The study reported that the first wave of testing, conducted in 2015, “engaged more than four million subjects in the experiment.” The second wave, carried out in 2019, involved more than sixteen million people.

During testing, people who clicked the People You May Know tool and viewed the recommendations were assigned different algorithmic paths. Some of these “treatment variants,” as the study called them, caused LinkedIn users to connect more with people with whom they had weak social bonds. Other modifications caused people to form fewer connections with weak bonds.
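As a rough illustration of how users end up on different “algorithmic paths” in an A/B test of this kind, the sketch below shows one common technique: deterministically hashing a user ID into an experiment arm. This is a minimal, hypothetical example of standard experimentation practice, not LinkedIn’s actual system; the variant names and experiment label are invented.

```python
import hashlib

# Hypothetical treatment variants: each biases the mix of weak vs. strong
# ties that a people-recommendation system would surface.
VARIANTS = ["mostly_weak_ties", "balanced", "mostly_strong_ties"]

def assign_variant(user_id: str, experiment: str = "pymk-ties-test") -> str:
    """Deterministically bucket a user into one experiment arm.

    Hashing (experiment name + user_id) gives every user a stable,
    pseudo-random assignment without storing per-user state, so the
    same member always sees the same treatment for a given experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

# A user always lands in the same arm of a given experiment:
assert assign_variant("member-42") == assign_variant("member-42")
```

Because assignment is a pure function of the user ID and the experiment name, the split is reproducible and roughly uniform across arms, which is what lets researchers later compare outcomes between the groups.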

It is unknown if most LinkedIn members are aware that they may be subject to experiments that could affect their employment opportunities.

LinkedIn’s privacy policy states that the company “may use the personal information we hold to study trends […], such as the availability of jobs and the skills required to perform those jobs.” Its policy for outside researchers who wish to analyze company data clearly states that such researchers “must not experiment or test our members.”

However, neither policy expressly informs consumers that LinkedIn itself may experiment or test its members.

In a statement, LinkedIn said, “We are transparent with our members in the research section of our Terms of Service.”

In an editorial statement, Science stated, “Our understanding and that of our reviewers was that the experiments conducted by LinkedIn were conducted in accordance with the guidelines of their Terms of Service.”

After the first wave of algorithmic tests, researchers from LinkedIn and MIT came up with the idea of analyzing the results of these experiments to test the theory of the strength of weak ties. Although this decades-old theory had become a mainstay of sociology, it had not been rigorously tested in a large-scale prospective experiment that randomly varied the strength of people’s social ties.

External researchers analyzed aggregated data from LinkedIn. The study found that people who received more recommendations of moderately weak contacts generally applied for and accepted more jobs, findings consistent with the theory of weak ties.

The study reported that the 20 million users involved in the LinkedIn experiments created more than 2 billion new social connections and submitted more than 70 million applications that resulted in 600,000 new jobs. The study also showed that weak ties proved most useful for job seekers in digital fields such as artificial intelligence, while strong ties proved most useful for jobs in industries less reliant on software.

LinkedIn mentioned that it has applied the weak connection learnings to several features, including a new tool that notifies members when a first- or second-degree connection is dropped. However, the company hasn’t made any study-related changes to its People You May Know feature.

Most significant about the study, according to MIT’s Aral, was that it demonstrated the importance of powerful social networking algorithms, not only for amplifying problems such as misinformation, but also as critical indicators of economic conditions such as employment and unemployment.

Catherine Flick, a senior researcher in computing and social responsibility at De Montfort University in Leicester, England, described the study as more of a marketing exercise for the company.

Flick concluded: “The study has an inherent bias. It shows that if you want to get more jobs, you should be on LinkedIn more.”
