Juul plans to release new Bluetooth-enabled vapes internationally

E-cigarette startup Juul Labs is planning to release Bluetooth-enabled vapes internationally, Bloomberg reports. The new tech could help lock out underage people who try to use the devices.

“We are actively evaluating new technologies and features to help keep Juul out of the hands of young people,” Juul spokesperson Victoria Davis says. The move comes as Juul faces user lawsuits and an investigation by the Massachusetts attorney general, both of which accuse the company of hooking young people on nicotine.

A recent patent application, published in April and first spotted by Bloomberg, hints that the company is at least considering new tech for its products. The application describes features that could help users keep better track of how often they’re using the device, control the strength of the dose, and change the intensity of the flavor.

The patent application also describes strategies for keeping the device locked unless the user activates it with a PIN or biometric identifier. And Bloomberg reports, based on an anonymous source, that Juul has been considering using geofences that would prevent the vapes from working at schools. But Juul wouldn’t be able to release any new tech or device modifications in the US yet because it would have to get FDA approval first.

Instead, the company plans to release the Bluetooth-connected vapes early next year “in international markets where we have launched (and per local regulations),” according to Davis. Those international markets include the UK and Israel, Davis says.

Davis couldn’t go into specifics about the new tech Juul plans to release, and she cautioned against reading too much into the patent application. “The description of a particular innovation in a patent application does not mean it is or will be implemented in any particular products or service.” But she did say in an email to The Verge that the Bluetooth-connected devices “will create the foundation that could enable a number of technological advances to help further restrict access to young people.”

23andMe and other DNA-testing firms promise not to share data without consent

A number of companies that offer consumer genetic testing, including 23andMe and Ancestry, have pledged to protect customer privacy under a new set of voluntary guidelines. The firms say they will now obtain “express consent” from customers before transferring their genetic data to third parties, and they promise to publish annual transparency reports detailing how and when their data is accessed by law enforcement.

The guidelines — which were also signed by Helix, MyHeritage, and Habit — are a reaction to public fears about how private companies share individuals’ sensitive genetic data. Customers pay for tests in the hopes of learning about their ancestry or predisposition to certain diseases, but often do not consider how this information might be used by others. (Or, indeed, how hackers might try to access it.)

The issue was brought into the spotlight following the April arrest of a man who was thought to be the Golden State Killer, a serial killer and rapist who was active in the 1970s and 1980s. The suspect was identified by matching a decades-old DNA sample to a public dataset of genetic information uploaded to ancestry site GEDmatch, with police claiming that the site’s privacy policy meant a court order was not needed to search the database.

GEDmatch is not covered by these new privacy guidelines, but some firms already publish annual reports on requests from law enforcement. 23andMe received five requests last year but did not turn over any data; Ancestry received 34 and provided data in 31 cases. Under these new guidelines, the companies say they will “attempt to notify” individuals when their data is requested, although they may be blocked from doing so by court gag orders.

One area of data-sharing these best practices won’t affect is anonymized medical research. Last month, for example, 23andMe announced a partnership with GlaxoSmithKline that gives the pharmaceutical giant access to “de-identified” genetic data from the roughly 80 percent of 23andMe users who permit their information to be used for drug research. In return, 23andMe received a $300 million investment from GSK. Nothing would change with this deal under the new guidelines, although 23andMe stresses that the information it shares covers only broad trends and insights, not personally identifiable data.

Although the new guidelines are voluntary, companies that sign the pledge and then break their promises could face federal censure. Juliana Gruenwald Henderson, a spokesperson for the FTC, told The Washington Post that companies that fail to keep these promises could be fined. “The FTC remains vigilant in protecting consumers’ privacy and security,” said Henderson.

IBM’s Watson gave unsafe recommendations for treating cancer

IBM’s Watson supercomputer gave unsafe recommendations for treating cancer patients, according to documents reviewed by Stat. The report is the latest sign that Watson, once hyped as the future of cancer research, has fallen far short of expectations.

In 2012, doctors at Memorial Sloan Kettering Cancer Center partnered with IBM to train Watson to diagnose and treat patients. But according to IBM documents dated from last summer, the supercomputer has frequently given bad advice, like when it suggested a cancer patient with severe bleeding be given a drug that could cause the bleeding to worsen. (A spokesperson for Memorial Sloan Kettering said this suggestion was hypothetical and not inflicted on a real patient.)

“This product is a piece of s—,” one doctor at Jupiter Hospital in Florida told IBM executives, according to the documents. “We bought it for marketing and with hopes that you would achieve the vision. We can’t use it for most cases.”

The documents come from a presentation given by Andrew Norden, IBM Watson’s former deputy health chief, right before he left the company. In addition to showcasing customer dissatisfaction, they reveal problems with Watson’s training methods, too. Watson for Oncology was supposed to synthesize enormous amounts of data and come up with novel insights. But it turns out most of the data fed to it was hypothetical, not real patient data. That means the suggestions Watson made were based simply on the treatment preferences of the few doctors providing the data, not on insights gained from analyzing real cases.

An IBM spokesperson told Gizmodo that Watson for Oncology has “supported care for more than 84,000 patients” and is still learning. Apparently, it’s not learning the right things.