Strategies & Market Trends : 2026 TeoTwawKi ... 2032 Darkest Interregnum


To: Maurice Winn who wrote (175816) 8/5/2021 11:09:34 PM
From: TobagoJack
 
Would observe that the <<“gain of function”>> R&D teed up by DARPA and subcontracted out in bits and pieces has gone viral, autonomous, and decentralized, much like a crypto network

In the meantime, the network CNN is setting an example, showing folks where, when and how to behave per the CNN regime which is, presumably, scientifically based. Hope the protocol works out, else the legal liabilities can be dear.

bloomberg.com

CNN Fires Three Employees for Coming to the Office Unvaccinated

Gerry Smith
August 6, 2021, 4:28 AM GMT+8
CNN fired three employees for coming to the office unvaccinated.

The cable news channel has mandated that all employees working in its offices or in the field be fully vaccinated. In a memo to staff Thursday, President Jeff Zucker said the network has “a zero-tolerance policy on this” and fired the three after learning this past week that they were coming to the office unvaccinated.

“You need to be vaccinated to come to the office,” he said. “And you need to be vaccinated to work in the field, with other employees, regardless of whether you enter an office or not. Period.”

CNN has thus far relied on an “honor system” and hasn’t required employees to show proof they’ve been inoculated. In the weeks ahead, providing evidence of vaccination may become a formal process across CNN’s parent company, AT&T Inc.’s WarnerMedia, Zucker said.

The network is also delaying employees’ return to offices from Sept. 7 to “early to mid October” given the resurgence of the pandemic, Zucker said. Most of CNN’s U.S. locations are open on a voluntary basis for employees who are fully vaccinated.


Sent from my iPad



To: Maurice Winn who wrote (175816) 8/5/2021 11:17:36 PM
From: TobagoJack
 
China found a TikTok analog

But not as a matter of national security, just an issue of healthy education. I wonder whether a domestic outfit’s app will appear, and whether it would similarly be disappeared off of app stores.

bloomberg.com

China Removes Duolingo From App Stores, Targeting U.S. EdTech
Crystal Tse
August 6, 2021, 6:32 AM GMT+8
Duolingo Inc., a Pittsburgh company that makes a popular language-learning app, was removed from some app stores in China, signaling the government’s crackdown on for-profit education is extending beyond the country’s shores.

“We are working to address the issue and are hopeful that the app will be reinstated in the near term,” the company said in an emailed statement. “In the meanwhile, existing users in China can continue to use the app as they always do.”

Unverified reports of the app’s removal spread on social media Thursday, briefly sending Duolingo’s stock lower. It ended the day with a 3% gain.


Sent from my iPad



To: Maurice Winn who wrote (175816) 8/5/2021 11:22:52 PM
From: TobagoJack
 
Whilst I applaud Apple’s first move, I cannot help wondering where it is all going later …

bloomberg.com

Apple to Detect, Report Sexually Explicit Child Photos on iPhone

Mark Gurman
August 6, 2021, 3:00 AM GMT+8


Apple Inc. said it will launch new software later this year that will analyze photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities. The moves quickly raised concerns with privacy advocates.

As part of new safeguards involving children, the company also announced a feature that will analyze photos sent and received in the Messages app to or from children to see if they are explicit. Apple also is adding features in its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.

If Apple detects a threshold of sexually explicit photos of children in a user’s account, the instances will be manually reviewed by the company and reported to the National Center for Missing and Exploited Children, or NCMEC, which works with law enforcement agencies. Apple said images are analyzed on a user’s iPhone and iPad in the U.S. before they are uploaded to the cloud.

Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the NCMEC. The company is using a technology called NeuralHash that analyzes images and converts them to a hash key or unique set of numbers. That key is then compared with the database using cryptography. Apple said the process ensures it can’t learn about images that don’t match the database.
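
To make the mechanics concrete, here is a minimal Python sketch of the flow the two paragraphs above describe: fingerprint an image on-device, check it against a database of known fingerprints, and flag the account only once a threshold of matches accumulates. It is an illustration under stated assumptions, not Apple's implementation: NeuralHash is a proprietary perceptual hash and the database comparison uses an unpublished cryptographic protocol, so the SHA-256 digest, the in-memory hash set, and the REVIEW_THRESHOLD value below are all stand-ins.

import hashlib
from dataclasses import dataclass

# Stand-in assumptions, not Apple's implementation: NeuralHash is a
# perceptual hash and the database comparison is cryptographic; a plain
# SHA-256 digest and an in-memory set are used here only to show the
# hash-then-match-then-threshold flow.
KNOWN_CSAM_HASHES: set = set()   # would hold the NCMEC-provided fingerprints
REVIEW_THRESHOLD = 30            # hypothetical number of matches before review

def image_fingerprint(image_bytes: bytes) -> str:
    """Map an image to a fixed-size fingerprint (stand-in for NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()

@dataclass
class UploadScanner:
    """Counts on-device matches; a single match does nothing, and only a
    threshold of matches flags the account, which per the article triggers
    manual review and a report to NCMEC."""
    match_count: int = 0

    def scan_before_upload(self, image_bytes: bytes) -> bool:
        """Return True once the account crosses the review threshold."""
        if image_fingerprint(image_bytes) in KNOWN_CSAM_HASHES:
            self.match_count += 1
        return self.match_count >= REVIEW_THRESHOLD

The point of the sketch is only the shape of the process the article attributes to Apple: matching happens against known fingerprints before upload, and individual matches are not enough to expose anything about a user's library.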

The Electronic Frontier Foundation said Apple is opening a backdoor to its highly touted privacy features for users with the new tools.

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the EFF said in a post on its website. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

Other researchers are likewise worried. “Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal,” Matthew Green, a cryptography professor at Johns Hopkins University, wrote on Twitter. “In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.”

Critics said the moves don’t align with Apple’s “what happens on your iPhone, stays on your iPhone” advertising campaigns. “This completely betrays the company’s pious privacy assurances,” wrote journalist Dan Gillmor. “This is just the beginning of what governments everywhere will demand. All of your data will be fair game. If you think otherwise, you’re terminally naive.”

Apple said its detection system has an error rate of “less than one in 1 trillion” per year and that it protects user privacy. “Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account,” the company said in a statement. “Even in these cases, Apple only learns about images that match known CSAM.”

Any user who feels their account has been flagged by mistake can file an appeal, the company said. To respond to privacy concerns about the feature, Apple published a white paper detailing the technology as well as a third-party analysis of the protocol from multiple researchers.

John Clark, president and chief executive officer of NCMEC, praised Apple for the new features. “These new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” Clark said in a statement provided by Apple.

The feature in Messages is optional and can be enabled by parents on devices used by their children. The system will check for sexually explicit material in photos received and those ready to be sent by children. If a child receives an image with sexual content, it will be blurred out and the child will have to tap an extra button to view it. If they do view the image, their parent will be notified. Likewise, if a child tries to send an explicit image, they will be warned and their parent will receive a notification.
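
As a rough sketch of the opt-in Messages flow just described (same hypothetical Python style as above; the looks_explicit classifier result and the notify_parent callback are assumptions for illustration, not Apple API):

from dataclasses import dataclass
from typing import Callable

@dataclass
class IncomingImage:
    data: bytes
    looks_explicit: bool   # hypothetical result of an on-device classifier

def present_to_child(image: IncomingImage,
                     child_confirms_view: bool,
                     notify_parent: Callable[[str], None]) -> str:
    """Blur first, require an extra tap, then notify the parent -- the
    sequence the article describes for child accounts."""
    if not image.looks_explicit:
        return "shown normally"
    if not child_confirms_view:
        return "shown blurred"
    notify_parent("Your child chose to view a flagged image.")
    return "shown after warning; parent notified"

The sending side has the same shape per the article: the child is warned first, and the parent is notified only if they proceed.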



[Image: Apple’s feature in Messages to analyze for explicit images.]

Apple said the Messages feature uses on-device analysis and the company can’t view message contents. The feature applies to Apple’s iMessage service and other protocols like Multimedia Messaging Service.

The company is also rolling out two related features to Siri and search. The systems will be able to respond to questions about reporting child exploitation and abusive images and provide information on how users can file reports. The second feature warns users who conduct searches for material that is abusive to children. The Messages and Siri features are coming to the iPhone, iPad, Mac and Apple Watch, the company said.



Sent from my iPad



To: Maurice Winn who wrote (175816) 8/5/2021 11:35:47 PM
From: TobagoJack
 
It would appear Trump did not know how to properly interact with TikTok

CCP shows the way with its domestic equivalent, but not on a silly national-security basis

On a video-opium premise

I am guessing videos with educational and similar merit are okay, and silly dancing not so much

bloomberg.com

Kuaishou Loss Deepens After Media Call for a Video Clampdown

Zheping Huang
August 6, 2021, 9:58 AM GMT+8
Kuaishou Technology plummeted almost 12% after an influential state-backed newspaper urged tighter regulation of internet video content, the latest in a string of pronouncements from government-controlled media calling for a crackdown on online industries.

The company slid to a low of HK$78.60 in Hong Kong, adding to Thursday’s 15% wipeout, after the Communist Party mouthpiece People’s Daily said in a commentary that Beijing should step up oversight of online platforms, particularly the way that anonymous social media users can band together to promote potentially undesirable content.

Kuaishou fell the most on record Thursday after a post-listing lockup on sales of its shares expired, underscoring the extent of investors’ fears about a widening Chinese online crackdown. Kuaishou joins rivals like Tencent Holdings Ltd. in a widespread market selloff this week, a wave of exits triggered by mounting uncertainty over the extent, direction and severity of Beijing’s widening clampdown on a plethora of online sectors.

Beijing’s campaign to rein in its giant internet industry is entering its 10th month, a roller-coaster ordeal that’s prompting nervous investors to ponder the longer-term ramifications of a crackdown on firms from Jack Ma’s Ant Group Co. and Alibaba Group Holding Ltd. to food delivery giant Meituan and ride-hailing leader Didi Global Inc.

Read more: China TikTok-Rival Kuaishou Craters, Widens China Tech Rout

Those actions demonstrated Beijing’s resolve to go after private enterprises to address social inequities, seize control of data it deems crucial to the economy and stability, and rein in powerful interests. Almost unnoticed amid a flurry of reports this week about gaming addiction was a Xinhua News Agency report outlining how regulators will soon step up oversight of how online media employ algorithms to promote content -- a key aspect of services operated by Kuaishou and ByteDance Ltd., among others.

Internet platforms should make efforts to promote a “healthy fan culture” among youths because insufficient oversight has encouraged improper activities, the People’s Daily wrote Friday. Some websites use algorithms and big data to create idols, then encourage fans to spend money, the newspaper said without naming any services.



Sent from my iPad