
HiQ Labs vs LinkedIn case OKs robot monitoring of employees

A U.S. appeals court ruling in the LinkedIn vs HiQ lawsuit could help shape how services gather information about employees from social media.

HiQ Labs Inc. has built a business of scraping and analyzing public data on LinkedIn Corp., a business networking site owned by Microsoft. LinkedIn wanted HiQ to stop, and the two ended up in federal court.

So far, LinkedIn is losing. The U.S. Court of Appeals for the Ninth Circuit ruled this week that San Francisco-based HiQ can keep using its software bots to collect that data. But even if LinkedIn drops its court effort, the issue is far from settled.

LinkedIn data is public, and anyone can view it, but the lawsuit raises concerns about the use of software bots to automate social media monitoring. HiQ's bots watch profiles for changes, a capability that has drawn interest from the HR community. One of its tools, Keeper, identifies employees who are a potential flight risk and gives HR users an individual risk score for each employee.
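The ruling doesn't describe HiQ's methods, but the basic mechanics of this kind of monitoring are straightforward: take periodic snapshots of a public profile, compare them, and score the changes. Below is a minimal Python sketch; the field names, weights and scoring rule are hypothetical illustrations, not HiQ's actual Keeper methodology.

from typing import Dict, Set

def changed_fields(previous: Dict[str, str], current: Dict[str, str]) -> Set[str]:
    """Return the profile fields whose values differ between two snapshots."""
    keys = set(previous) | set(current)
    return {k for k in keys if previous.get(k) != current.get(k)}

def flight_risk_score(changes: Set[str]) -> int:
    """Toy scoring rule: weight the kinds of edits that often precede a job search."""
    weights = {"headline": 3, "summary": 2, "skills": 1}
    return sum(weights.get(field, 0) for field in changes)

old_snapshot = {"headline": "Engineer at Acme", "summary": "Building widgets", "skills": "Python"}
new_snapshot = {"headline": "Open to new opportunities", "summary": "Building widgets", "skills": "Python, SQL"}

print(flight_risk_score(changed_fields(old_snapshot, new_snapshot)))  # 4 under these toy weights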

The appeals court reaffirmed that public data on LinkedIn is not private. "There is little evidence that LinkedIn users who choose to make their profiles public actually maintain an expectation of privacy," the court said. 

In a statement, LinkedIn said it is "disappointed in the court's decision, and we are evaluating our options following this appeal." It also said that it "will continue to fight to protect our members and the information they entrust to LinkedIn." HiQ declined to comment.

What's wholly public -- and what isn't

LinkedIn told the court that the data scraping is done without the consent of its members and is a violation of the Computer Fraud and Abuse Act (CFAA), an anti-hacking law. HiQ argued the information was "wholly public" and accessible to anyone.


Shain Khoshbin, an attorney at Munck Wilson Mandala, LLP in Dallas, described the court's decision as troubling. He used a physical locker as an analogy to explain why: a person could look through a locker's vents, "take pictures of its contents, and analyze and sell some version of that information to others -- arguably whether or not the locker has a padlock, and even if the locker's owner sends a cease and desist letter saying stop it."

Under that reasoning, the owner of the contents "has no serious privacy expectation as to the contents of the locker," Khoshbin said.

What seems lost in all the court decisions, Khoshbin said, is "what authority and authorization powers should be left to the owners of the data?"

It's privacy vs freedom 

The case has divided internet advocacy groups. The Electronic Privacy Information Center (EPIC) filed a brief arguing that the lower court erred: "Regrettably, the lower court discounted the privacy interests of users and required LinkedIn to make the personal data of LinkedIn users available to data aggregators for whatever purpose they wish. That cannot be correct."

But the Electronic Frontier Foundation, which also filed a brief, is pleased with the outcome. It said the CFAA was designed to target people who hack into computers. Letting the LinkedIn position prevail would set a precedent allowing any website to bar any software bot, a move that would hurt journalists, researchers and others.

LinkedIn has the technology to stop automated software bots from collecting its member data. Its robots.txt file -- the standard file that tells bots which parts of a site they may crawl -- prohibits automated access to its servers except for the bots it permits, such as the Google search engine. LinkedIn also has security tools to block bots. HiQ was fighting to keep LinkedIn from blocking its bots' access to LinkedIn's public information.
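Robots.txt is a voluntary convention: it tells crawlers which parts of a site they may visit, but compliance is up to the bot. A short Python sketch using the standard library's urllib.robotparser shows how a well-behaved crawler checks the file before fetching a page; the profile URL and user-agent names below are illustrative assumptions, not a statement of LinkedIn's actual policy.

from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://www.linkedin.com/robots.txt")
parser.read()  # download and parse the site's robots.txt rules

# A well-behaved crawler asks before fetching each page.
profile_url = "https://www.linkedin.com/in/some-profile"
print(parser.can_fetch("Googlebot", profile_url))       # allowed only if the rules permit it
print(parser.can_fetch("SomeScraperBot", profile_url))  # a hypothetical bot the site has not whitelisted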

Protecting an information monopoly

Bryan Harper, a manager at Schellman & Company, LLC, a global independent security and privacy compliance assessor in Tampa, Fla., said the ruling doesn't change the ability of firms to protect themselves.

"In practical terms, companies basically continue business as usual," Harper said. "If there is a malicious actor or a threat event that is captured by a monitoring tool, then a company should and has a duty to respond with their standard incident response procedures." 

But "companies should not selectively target those scraping efforts simply to protect an information monopoly," Harper said. It's also impractical to block all bots, he said.

The takeaway is "you can't target competitors" if the services rely on public data that isn't considered private by their users, Harper said.

Still, the issues may be fleshed out in other lawsuits, "since this appeal involved an interim ruling on a preliminary injunction," Khoshbin said.

The appeals court made it clear that even if the CFAA does not apply, entities that view themselves as victims may still be able to raise claims under state laws, copyright infringement, misappropriation, unjust enrichment or breach of privacy, among other avenues, according to Khoshbin.
