Siri Archives – We Got This Covered
All the latest news, trailers, & reviews for movies, TV, celebrities, Marvel, Netflix, anime, and more.

Can I fill out the Apple class action lawsuit form online and claim the settlement money?
By Marco Vito Oddo | Sun, 05 Jan 2025
Apple’s alleged misuse of Siri has resulted in a $95 million settlement.

Apple has recently agreed to a multimillion-dollar settlement that could affect tens of millions of Siri users across the United States, and you might be one of them.

Siri, introduced by Apple in 2011, revolutionized how people interact with their devices. As a voice-activated virtual assistant, Siri is designed to help users perform tasks, answer questions, and control their devices through natural language commands. Users typically activate Siri by saying “Hey Siri” or pressing a designated button, after which the assistant listens for commands and responds accordingly. This technology has become deeply integrated into Apple’s ecosystem, appearing in iPhones, iPads, Mac computers, Apple Watches, HomePod speakers, and Apple TVs.

However, the convenience of voice-activated assistance comes with privacy implications. The lawsuit, filed in Oakland, California federal court, centers on allegations that Apple’s Siri voice assistant recorded private conversations without users’ consent. According to court documents, the unauthorized recordings occurred when Siri was inadvertently activated, capturing personal discussions that users believed were private. More concerning, these recordings were allegedly shared with third parties, including advertisers, leading to targeted advertisements based on private conversations.

Users reported receiving targeted advertisements for products they had only discussed verbally near their devices. For instance, two plaintiffs in the lawsuit reported seeing ads for Air Jordan sneakers and Olive Garden restaurants shortly after mentioning these brands in private conversations. Another plaintiff claimed to receive advertisements for a specific surgical treatment after discussing it privately with their doctor. Finally, a whistleblower revealed to The Guardian that accidental activations were common, with something as simple as a zipper sound potentially triggering Siri.

The proposed $95 million settlement comes after a five-year legal battle initiated by the 2019 report from The Guardian, which revealed that Apple’s third-party contractors regularly heard confidential information. While Apple has maintained its denial of any wrongdoing, the company has agreed to settle the case, potentially offering compensation to customers who owned Siri-enabled devices between September 2014 and December 2024.

How and when can you submit your Siri settlement claim?

Currently, you cannot fill out the Apple class action lawsuit form online. The settlement is still pending preliminary approval from U.S. District Judge Jeffrey White in the Oakland federal court, and no claim submission process has been established yet. Once the settlement receives court approval, which could take several weeks or months, eligible individuals will likely be able to submit their claims through an official settlement website. The settlement is expected to compensate qualifying users with up to $20 per Siri-enabled device, with claimants able to submit claims for up to five devices, for a maximum of $100 per person. Granted, it’s not much, but that figure could increase if the court rejects the current terms and pushes Apple to compensate its users more adequately.

To qualify for the settlement, users must confirm under oath that they experienced Siri listening to a “conversation intended to be private” without specific activation. Eligible devices include iPhones, iPads, HomePod speakers, Mac computers, Apple Watches, and Apple TVs purchased during the specified period. Beyond monetary compensation, the settlement requires Apple to confirm the deletion of Siri audio recordings collected before October 2019 and to create a webpage better explaining its opt-in “Improve Siri” program. However, potential claimants should note that accepting settlement money means forfeiting their right to sue Apple for related claims in the future.

For now, interested parties should stay informed about the settlement’s progress through official channels. Once the court grants preliminary approval, details about the claim submission process will be made public, allowing affected users to seek their portion of the settlement. The preliminary approval hearing is scheduled for Feb. 14, 2025, after which more specific information about the claims process should become available.

‘Why would Apple do this?’ The discovery of all the things Siri controls comes far too late for one unfortunate soul
By Nahila Bonfiglio | Tue, 23 Jul 2024
“Tim Cook? More like don’t let Tim Cook.”

Our technologies are slowly taking over our lives, from the ever-present necessity of the mini-computers in our back pockets to the rise of smart TVs, smart devices, and even full-blown smart homes. 

AI assistants are also becoming more and more prevalent in our increasingly technological world, and two leading contenders are vying for the crown. Amazon’s Alexa has been slowly honing its approach, graduating from glorified DJ to full-blown virtual employee, managing schedules, operating smart homes, and even doing our shopping for us. 

Siri is much the same — some would argue even better — but, as with anything, the Apple staple has a few shortcomings. One of them was recently flagged by TikTok user David Seung, a comedian who stumbled across a hilarious but inconvenient gap in the virtual assistant’s capabilities. 

Noting that he’s been leaning on Siri’s “Korean man” voice to help teach him Korean — no, it isn’t working — Seung plunged into a hilarious explanation of exactly why you shouldn’t follow in his footsteps. For one thing, his reliance on Korean man Siri is really only resulting in missed exits — it’s hard to follow directions in a language you don’t speak — but more importantly, assigning Siri a new voice has some unforeseen consequences.

Most notably, it turns out the virtual assistant fills in any gaps you’ve missed in your cellular experience. And, if you’ve failed to create your own voicemail message, that duty lands in Siri’s lap. 

Which, at least in Seung’s case, leads to a very confusing, nearly unintelligible outgoing voicemail message. Due to his decision to tap Korean man Siri as his virtual assistant of choice, Korean man Siri steps in to fulfill the task, and tries “his best in English.” 

[Embedded TikTok from @sunnyd.o.g (David Seung Comedy): “A PSA for anyone using Siri in any voice other than the default. Don’t let Tim Cook.”]

The result is a hilariously garbled message, as Korean man Siri attempts to inform callers that the person they’re trying to reach is not available and invites them to record a message after the beep. For an AI that doesn’t speak English, it’s actually incredible, but for anyone attempting to leave a message, it’s probably all but impossible to understand.

Noting that he uses “this phone for work,” Seung laughingly asks the powers that be, “Why would Apple do this?” It’s a great question, and one that viewers shared once they got an earful of Korean man Siri’s shoddy attempt at an outgoing voicemail.

With numerous commenters noting that “he’s just trying his best,” there’s more praise than criticism for the AI’s valiant attempt in the comment section. “Why does Korean Siri sound so wholesome though,” one commenter asked. “I want to do this just to confuse people but don’t want to miss my exit,” another added.

That ambition is common among commenters, as is distress over the discovery. People everywhere learned, thanks to Seung, that their own outgoing voicemails are similarly AI-created, and the revelation is answering a lot of questions. 

Issues like the one faced by Seung and so many commenters are increasingly common in this day and age, but they’re also nothing new. After all, virtual assistants have been around since the mid-60s. They were certainly a far cry from what we’re working with today, but those early attempts paved the way for Alexa and Siri. For decades we’ve been slowly honing the approach, and in the early 2010s we got our first glimpse of the dazzling AI that would become Siri and Alexa. They’ve evolved massively from those early days, but virtual assistants are actually a tried-and-true staple of technology. 

There’s still a long way to go, however, as Seung’s video makes clear. It’s a hilarious gap in Siri’s capabilities, and one that most certainly led to plenty of confusion, but it’s also quickly becoming a feature rather than a bug in the eyes of viewers. Since Millennials hate picking up the phone so much, we may as well make the voicemail process a little more fun for the people we’re ignoring.

‘Siri’s like: Not on my watch… literally’: Woman bemoaning current situation gets an unhelpful suggestion from Siri
By Jordan Collins | Mon, 11 Sep 2023
AI doesn’t get Gen Z or Millennial humor.

We all know our phones and smart watches are constantly listening to every word we say, but now TikTok has shown that they’ve started offering their unwanted advice too.

The beginning of a TikToker’s rant was cut hilariously short by Siri. In a video shared by allieploense, we see her sitting in a car, clearly about to burst into what could have been a legendary rant about the struggles of life, work, or something else entirely. “Oh, god. I don’t wanna be here. Hmm. Not a slay.”

Whatever it was allieploense was about to say, it sounds like we all could have related to it. However, Siri decided that she had heard enough, expressing concern for the TikToker’s mental health as well as offering to call the National Suicide Prevention Hotline.

Siri’s intervention is both incredibly dark and hilarious, considering it’s pretty clear that the TikToker wasn’t being serious with her statement. Whilst many are scared of AI taking our jobs, it’s somewhat comforting to think that robots are still struggling to understand the concept of hyperbole, and viewers were quick to poke fun at the Apple assistant.

“….ok🫤” she sounded so backhanded

Siri’s like: not on my watch… literally.

Others shared similar stories in which Siri offered some unwanted “help.”

This happened to me at gym when I said “I think I’m dying” on the stair climber

One time.. she did that to me… while I was singing 😭 guess it was that bad

Anyway, as funny as the whole situation is, at least Siri’s heart was in the right place and maybe we should be glad that, right now, robots are looking out for us. If movies have taught us anything, it’s that this might not be the case forever.

If you or someone you know may be considering suicide, contact the National Suicide Prevention Lifeline at 1-800-273-8255 (En Español: 1-888-628-9454; Deaf and Hard of Hearing: 1-800-799-4889) or the Crisis Text Line by texting HOME to 741741. A list of international crisis resources can be found here.

Susan Bennett’s net worth and how much she makes from voicing Apple’s Siri
By Chynna Wilkinson | Tue, 18 Jul 2023
All the credit and none of the moola.

You might not know the name Susan Bennett, but you’ll absolutely know the name Siri. Back in October 2011, when Apple released the iPhone 4S, the first iPhone to ship with Siri, Bennett provided the voice for the company’s virtual assistant.

Siri’s original American, British, and Australian voice actors — who included Bennett — recorded their respective voices around 2005 for ScanSoft, unaware that the recordings would eventually be used for Siri’s debut. Additionally, Siri originated as an app available in Apple’s iOS App Store, but was eventually integrated into the iPhone 4S. Ever since, Siri has evolved along with every year’s newest iPhone release.

Susan Bennett was born in Burlington, Vermont, in 1949. Not only is she one of the world’s most famous voice actors, but she has also worked as a backup singer for Burt Bacharach and Roy Orbison. Bennett’s voice has been used for numerous purposes, such as the public address system in all Delta Air Lines terminals worldwide, GPS navigation software, and several company commercials, including spots for McDonald’s, Coca-Cola, and Ford.

As of June 1, 2023, Bennett has a net worth of approximately $5 million. She provided the voice of Siri until the iOS 7 update in September 2013. Contrary to popular belief, Bennett was never compensated by Apple for her work, as the company used her pre-existing recordings without her prior knowledge.

Bennett doesn’t seem to mind all that much, however, as she told Insider she’s “enjoyed being Siri.” In the same interview, Bennett spills the beans on how Siri came to be.

“I got a gig to record for ScanSoft, an interactive voice response company, now called Nuance. I thought the script would consist of regular sayings, like ‘Thanks for calling,’ or ‘Please dial one.’ Instead, I had to read nonsensical sentences like ‘Cow hoist in the tug hut today’ or ‘Say shift fresh issue today’ — they were trying to get all of the sound combinations in the English language. They also had me read the names of addresses and streets.”

She also speaks out on the mistreatment of voice actors, especially the voices behind AI systems.

“I think a lot of people don’t even consider that there are human beings behind AI voices, or that a real person recorded it and deserves to be paid.”

Siri’s voice has been updated several times over the years, so Bennett can finally ask Siri her burning questions without immediately cringing.
