12 unexpected ways algorithms control your life

Mashable’s series Algorithms explores the mysterious lines of code that increasingly control our lives — and our futures.

Blame the algorithm.

That’s become the go-to refrain for why your Instagram feed keeps surfacing the same five people or why YouTube is feeding you questionable “up next” video recommendations. But you should blame the algorithm — those ubiquitous instructions that tell computer programs what to do — for more than messing with your social media feed.

Algorithms are behind many mundane, but still consequential, decisions in your life. The code often replaces humans, but that doesn’t mean the results are foolproof. An algorithm can be just as flawed as its human creators.

These are just some of the ways hidden calculations determine what you do and experience.

1. If you can eat out during the pandemic

The U.S. Federal Emergency Management Agency, or FEMA, created a pandemic prediction algorithm that state leaders can use (and are using) to determine when and which businesses should be allowed to reopen.

Bloomberg reports that the Arizona governor used the prediction tool to speed up an ill-fated reopening in May. The feds’ timeline was much earlier than academic experts’ guidance for the state reactivation plan. We saw how that turned out: a huge spike in coronavirus cases.

2. Getting into college

Admission algorithms can make or break your academic plans. A Washington Post investigation found 44 schools use prediction software to give applicants a score out of 100 for their admissions processes. The score considers different aspects of a student’s application, from test scores and transcripts to home address and even which websites they’ve visited. That’s all calculated to rate how strong a match a student is for a school.

These public and private universities (working with outside consulting firms) also try to predict whether a student will enroll if admitted, so your interest and perceived compatibility with the campus are also calculated. That calculation sometimes happens even before you apply, so likely candidates can be targeted.
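
To make that concrete, here’s a minimal sketch of how a 0-to-100 “match score” like the one the Post describes could be computed. The feature names, weights, and example numbers are all assumptions made up for illustration, not the formula any school or consulting firm actually uses.

    # Illustrative sketch only: hypothetical features and weights,
    # not any university's or vendor's real scoring formula.
    def admissions_score(applicant: dict) -> float:
        """Combine weighted application signals into a 0-100 match score."""
        weights = {
            "test_percentile": 0.40,   # standardized test percentile (0-100)
            "gpa_percentile": 0.35,    # transcript strength (0-100)
            "engagement": 0.15,        # e.g. visits to the school's website (0-100)
            "geo_fit": 0.10,           # e.g. home-address proximity score (0-100)
        }
        score = sum(weights[k] * applicant.get(k, 0) for k in weights)
        return round(min(score, 100.0), 1)

    print(admissions_score({
        "test_percentile": 88, "gpa_percentile": 92,
        "engagement": 40, "geo_fit": 70,
    }))  # 80.4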

3. Your grades

Last week, students protested in the UK after the education department initially decided to use an algorithm to issue grades, since pandemic shutdowns meant students couldn’t sit for the exams that determine their chances of getting into university. Students wised up to the fact that a computer was calculating their scores rather than teachers basing them on previous performance, and that students from poorer schools were more likely to receive lower grades from the calculations than students at more affluent institutions. The outrage forced the department to reverse its decision.

A similar situation gave grading power to the machines for this year’s International Baccalaureate program. The final exam couldn’t happen, so the program built an algorithm to predict how students would’ve fared based on grades and assignments from earlier in the year, along with grades from former students from the same schools, according to Wired. Many students were shocked to receive lower-than-expected scores that will keep them out of colleges and other programs. More than 25,500 students, teachers, parents, and other supporters have signed an online petition demanding the program acknowledge the score scandal and rectify the problem.

4. Renting an apartment

Algorithms not only determine how much rent you pay; a computer could also decide whether you can even snag a lease. Background-check software uses algorithms to build a profile of each prospective tenant, which landlords use to decide which applicants to pick. But a New York Times investigation found many of the automated reports compile incorrect information, often pulled from the wrong person with a similar name.

The background checks are generated quickly and sent over without a human ever validating the information. Lawsuits are piling up against the companies that perform the screenings after hundreds of renters were denied housing because of false information. For anyone with a common last name, those inaccuracies can keep you from ever renting.

5. Determining your mortgage

If you’re Black or Latino and shopping for a mortgage on a new home, your rate is likely to be higher. Algorithms used to calculate lending rates have a racial bias, a UC Berkeley study found. Those formulas tend to reduce instances of face-to-face discrimination, but they inadvertently increase costs for applicants who shop around less when seeking a loan, and those applicants are more often Black and Latino than white mortgage seekers.

6. Pricing your insurance

Insurance is all about risk assessment, but instead of a human reviewing an application, software decides how risky you seem. The companies scrape together as much personal data about you as possible. One insurance group started using wearables and other digital trackers to monitor how its clients behave, much like the trackers insurers put in cars for auto policies. If insurers see that you’re eating unhealthily or rarely exercising, you’ll be charged more.
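
As a rough illustration of how tracked behavior could translate into a price, here’s a toy calculation. The base premium, thresholds, and surcharges below are invented for the example; real insurers don’t publish their formulas.

    # Toy example: hypothetical surcharges applied to an assumed base premium.
    BASE_PREMIUM = 200.00  # assumed monthly premium in dollars

    def adjusted_premium(avg_daily_steps: int, fast_food_meals_per_week: int) -> float:
        """Nudge the premium up based on behavior reported by a wearable."""
        premium = BASE_PREMIUM
        if avg_daily_steps < 5000:           # rarely exercising
            premium *= 1.10
        if fast_food_meals_per_week > 5:     # eating unhealthily
            premium *= 1.15
        return round(premium, 2)

    print(adjusted_premium(avg_daily_steps=3500, fast_food_meals_per_week=7))  # 253.0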

7. Getting hired

Recruitment software is supposed to streamline the hiring process. But it can end up favoring certain applicants based on their name, gender, ethnicity, and other demographics. Your resume and cover letter are often scanned before ever reaching human eyes, especially if you apply to a large national or even global company, as the World Economic Forum explains.

The screening process is about winnowing down applicants, so if your application doesn’t have certain keywords or meet education requirements, out you go. Even after this step, AI tools like HireVue can evaluate a video interview to determine whether an in-person interview is warranted. Your word choice, tone, and facial expressions are tracked and scored against the employer’s specifications, which you never explicitly see.
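
A bare-bones version of that first keyword screen might look like the sketch below. The keyword list and pass threshold are placeholders, not taken from any real applicant-tracking system.

    # Minimal keyword screen: hypothetical requirements, not a real ATS.
    REQUIRED_KEYWORDS = {"python", "sql", "bachelor"}

    def passes_screen(application_text: str, min_hits: int = 2) -> bool:
        """Return True if the resume/cover letter mentions enough keywords."""
        text = application_text.lower()
        hits = sum(1 for kw in REQUIRED_KEYWORDS if kw in text)
        return hits >= min_hits

    print(passes_screen("Bachelor of Science, five years of SQL reporting"))  # True
    print(passes_screen("Award-winning pastry chef"))                         # False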

8. Your work schedule

Instead of a human manager setting work hours and modifying the schedule for vacation requests, many companies use software services. Bigger companies with huge hourly workforces, like Target and Starbucks, plug employee availability into the system to create schedules, according to a Motherboard investigation.

Scheduling services like those from employee-software company Kronos can lead to “schedule uncertainty,” or unstable and inconsistent work hours. This affects women of color the most, a UC Berkeley study released last year found.

9. Whether you’re going to quit your job

An exit interview comes too late for a company to keep an employee working. Instead, a new algorithm can give companies a heads-up about dissatisfied workers who are likely to quit, well before they give notice.

Two researchers found a way to calculate if someone is about to jump ship. The algorithm considers a few key factors, like big organizational changes and how connected someone feels to the job, to predict whether you’re about to leave your post. The researchers list “turnover shocks” and “job embeddedness” as the main metrics for measuring whether someone is going to leave. Shocks can be changes within the company or in your personal life. Embeddedness is how connected someone is to the work community and whether personal interests and skills line up with their daily work and job title.

This information is highly valuable to the human resources team, which can then focus its retention efforts on specific workers and situations.
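
The researchers’ exact formula isn’t spelled out here, but as a toy illustration, the two metrics they name could be folded into a single flight-risk score roughly like this. The weights and the 0-to-1 scale are assumptions made for the example, not their actual model.

    # Toy flight-risk score from "turnover shock" and "job embeddedness".
    # Weights and scales are assumptions, not the researchers' actual model.
    def flight_risk(shock: float, embeddedness: float) -> float:
        """Both inputs assumed normalized to 0-1; returns risk in 0-1."""
        return round(0.6 * shock + 0.4 * (1.0 - embeddedness), 2)

    # A recent reorg (high shock) plus weak ties to the team (low embeddedness)
    print(flight_risk(shock=0.8, embeddedness=0.3))  # 0.76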

10. How much gig workers earn

Uber drivers in Europe are suing to gain access to the secret calculations the ride-hailing company uses to match drivers with trips. The drivers claim it’s their right to understand how the app decides which driver gets certain fares and what profile and historical information Uber tracks. Drivers are kept in the dark about how the Uber app functions for each individual ride.

It’s the same murky situation for how most gig workers get assigned gigs, like those in food and grocery delivery.

11. Deciding if you’re a crime risk

Hidden calculations can attempt to determine whether you’re going to commit a crime before anything actually happens. The New York Times reported on a risk score that a UK city builds and assigns to teens and “at-risk” youth. The score is based on police and government records and on whether the youth take part in any social programs. It also uses school attendance, connections to other “high-risk” kids, and other data about housing and the parents.

Similar calculations are applied in American prison systems, like in Philadelphia. An algorithm there decides on probation terms for recently released individuals.

12. Who you match with on dating apps

Dating apps use previous data on who likes whom to predict what will work going forward. So the algorithm starts making decisions about whom to serve up as a potential date based on how previous matchmaking worked out. As this research paper examined, your choices are already pre-filtered for you before you even get to swipe left or right. The researchers called this system design “overriding users’ decisional autonomy.”
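
To see how past swipes can pre-filter future options, here’s a minimal collaborative-filtering-style sketch: candidates are ranked by how often they were liked by users whose tastes overlap with yours. The data and scoring are invented for illustration; no dating app discloses its real matching code.

    from collections import Counter

    # Hypothetical swipe history: who each user has liked so far.
    likes = {
        "you":   {"a", "b"},
        "user2": {"a", "b", "c"},
        "user3": {"a", "d"},
    }

    def recommend(user: str, candidates: set) -> list:
        """Rank candidates by likes from users with overlapping taste."""
        scores = Counter()
        for other, liked in likes.items():
            if other == user:
                continue
            overlap = len(likes[user] & liked)   # shared past likes
            for c in liked & candidates:
                scores[c] += overlap
        return [c for c, _ in scores.most_common()]

    print(recommend("you", {"c", "d", "e"}))  # ['c', 'd']: 'e' never gets surfaced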

As one of the protesting UK students said, “It’s all because of a computer algorithm.”
