HACKINTOSH.ORG | Macintosh discussion forums

Macintosh News => Apple News => Topic started by: HCK on January 30, 2021, 04:05:09 pm



Title: Tim Cook Implies That Facebook's Business Model of Maximizing Engagement Leads to Polarization and Violence
Post by: HCK on January 30, 2021, 04:05:09 pm
Tim Cook Implies That Facebook's Business Model of Maximizing Engagement Leads to Polarization and Violence

Apple CEO Tim Cook today spoke at the virtual Computers, Privacy, and Data Protection conference, condemning the business model of companies like Facebook and emphasizing Apple's commitment to advancing user privacy.





(https://images.macrumors.com/article-new/2021/01/tim-cook-privacy-conference.jpg)


"At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement — the longer the better — and all with the goal of collecting as much data as possible," said Cook. "It is long past time to stop pretending that this approach doesn't come with a cost — of polarization, of lost trust and, yes, of violence," he added.





Cook highlighted two recent privacy measures that Apple has taken, including privacy labels in the App Store and App Tracking Transparency, which will require apps to request permission to track users starting with the next iOS 14, iPadOS 14, and tvOS 14 betas (https://www.macrumors.com/2021/01/28/app-tracking-transparency-required-soon/). Apple says the software updates will be released in the early spring.
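For developers, the App Tracking Transparency requirement surfaces through Apple's AppTrackingTransparency framework in iOS 14. Below is a minimal, illustrative sketch of how an app might prompt for tracking permission before reading the advertising identifier; the helper function name is hypothetical, and it assumes the app's Info.plist includes an NSUserTrackingUsageDescription string explaining why tracking is requested.

import AppTrackingTransparency
import AdSupport

// Illustrative sketch only: ask the user for tracking permission before
// reading the advertising identifier (IDFA). Assumes an
// NSUserTrackingUsageDescription entry exists in Info.plist.
func requestTrackingPermissionIfNeeded() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User allowed tracking; the IDFA is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted:
            // Tracking refused or unavailable; the IDFA returns all zeros.
            print("Tracking not permitted")
        case .notDetermined:
            // The prompt has not been shown yet (e.g. called too early).
            print("Tracking status not determined")
        @unknown default:
            print("Unknown tracking status")
        }
    }
}

Once the user declines, the system will not show the prompt again for that app, so apps are expected to degrade gracefully to non-tracked advertising.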





On an earnings call yesterday, Facebook CEO Mark Zuckerberg said Apple's privacy claims are often misleading and self-serving (https://www.macrumors.com/2021/01/27/apple-privacy-changes-anti-competitive-facebook/):
Apple has every incentive to use their dominant platform position to interfere with how our apps and other apps work, which they regularly do to preference their own. And this impacts the growth of millions of businesses around the world.





Including -- with the upcoming iOS 14 changes, many small businesses will no longer be able to reach their customers with targeted ads. Now, Apple may say that they're doing this to help people, but the moves clearly track their competitive interests.

Today is Data Privacy Day, and Apple has marked the occasion by sharing "A Day in the Life of Your Data (https://www.macrumors.com/2021/01/28/apple-a-day-in-the-life-of-your-data/)," an easy-to-understand PDF report that explains how third-party companies track user data across websites and apps, highlights Apple's privacy principles, and provides more details about App Tracking Transparency.





Cook's remarks can be heard in the YouTube video below, starting at the 3:50 mark:





<div class="center-wrap"><iframe width="560" height="315" src="https://www.youtube.com/embed/zjP9JYeAS5s" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></div>


A full transcript of Cook's prepared remarks is available below.

Good afternoon.





John, thank you for the generous introduction and for hosting us today.





It's a privilege to join you — and to learn from this knowledgeable panel — on this fitting occasion of Data Privacy Day.





A little more than two years ago, joined by my good friend, the much-missed Giovanni Buttarelli, and Data Protection regulators from around the world, I spoke in Brussels about the emergence of a data-industrial complex.





At that gathering we asked ourselves: "what kind of world do we want to live in?"





Two years later, we should now take a hard look at how we've answered that question.





The fact is that an interconnected ecosystem of companies and data brokers, of purveyors of fake news and peddlers of division, of trackers and hucksters just looking to make a quick buck, is more present in our lives than it has ever been.





And it has never been so clear how it degrades our fundamental right to privacy first, and our social fabric by consequence.





As I've said before, "if we accept as normal and unavoidable that everything in our lives can be aggregated and sold, then we lose so much more than data. We lose the freedom to be human."





And yet this is a hopeful new season. A time of thoughtfulness and reform. And the most concrete progress of all is thanks to many of you.





Proving cynics and doomsayers wrong, the GDPR has provided an important foundation for privacy rights around the world, and its implementation and enforcement must continue.





But we can't stop there. We must do more. And we're already seeing hopeful steps forward worldwide, including a successful ballot initiative strengthening consumer protections right here in California.





Together, we must send a universal, humanistic response to those who claim a right to users' private information about what should not and will not be tolerated.





As I said in Brussels two years ago, it is certainly time, not only for a comprehensive privacy law here in the United States, but also for worldwide laws and new international agreements that enshrine the principles of data minimization, user knowledge, user access and data security across the globe.





At Apple, spurred on by the leadership of many of you in the privacy community, these have been two years of unceasing action.





We have worked to not only deepen our own core privacy principles, but to create ripples of positive change across the industry as a whole.





We've spoken out, time and again, for strong encryption without backdoors, recognizing that security is the foundation of privacy.





We've set new industry standards for data minimization, user control and on-device processing for everything from location data to your contacts and photos.  





At the same time that we've led the way in features that keep you healthy and well, we've made sure that technologies like a blood-oxygen sensor and an ECG come with peace of mind that your health data stays yours.





And, last but not least, we are deploying powerful, new requirements to advance user privacy throughout the App Store ecosystem.





The first is a simple but revolutionary idea that we call the privacy nutrition label.





Every app — including our own — must share their data collection and privacy practices, information that the App Store presents in a way every user can understand and act on.





The second is called App Tracking Transparency. At its foundation, ATT is about returning control to users — about giving them a say over how their data is handled.





Users have asked for this feature for a long time. We have worked closely with developers to give them the time and resources to implement it. And we're passionate about it because we think it has the great potential to make things better for everybody.





Because ATT responds to a very real issue.





Earlier today, we released a new paper called "A Day in the Life of Your Data." It tells the story of how apps that we use every day contain an average of six trackers. This code often exists to surveil and identify users across apps, watching and recording their behavior.





In this case, what the user sees is not always what they get.





Right now, users may not know whether the apps they use to pass the time, to check in with their friends, or to find a place to eat, may in fact be passing on information about the photos they've taken, the people in their contact list, or location data that reflects where they eat, sleep or pray.





As the paper shows, it seems that no piece of information is too private or personal to be surveilled, monetized, and aggregated into a 360-degree view of your life. The end result of all of this is that you are no longer the customer, you're the product.





When ATT is in full effect, users will have a say over this kind of tracking.





Some may well think that sharing this degree of information is worth it for more targeted ads. Many others, I suspect, will not, just as most appreciated it when we built a similar functionality into Safari limiting web trackers several years ago.





We see developing these kinds of privacy-centric features and innovations as a core responsibility of our work. We always have, we always will.





The fact is that the debate over ATT is a microcosm of a debate we have been having for a long time — one where our point of view is very clear.





Technology does not need vast troves of personal data, stitched together across dozens of websites and apps, in order to succeed. Advertising existed and thrived for decades without it. And we're here today because the path of least resistance is rarely the path of wisdom.





If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform.





We should not look away from the bigger picture.





At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement — the longer the better — and all with the goal of collecting as much data as possible.





Too many are still asking the question, "how much can we get away with?" when they need to be asking, "what are the consequences?"





What are the consequences of prioritizing conspiracy theories and violent incitement simply because of their high rates of engagement?





What are the consequences of not just tolerating, but rewarding content that undermines public trust in life-saving vaccinations?





What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more?





It is long past time to stop pretending that this approach doesn't come with a cost — of polarization, of lost trust and, yes, of violence.





A social dilemma cannot be allowed to become a social catastrophe.





I think the past year, and certainly recent events, have brought home the risk of this for all of us — as a society, and as individuals as much as anything else.





Long hours spent cooped up at home, the challenge of keeping kids learning when schools are closed, the worry and uncertainty about what the future would hold, all of these things threw into sharp relief how technology can help — and how it can be used to harm.





Will the future belong to the innovations that make our lives better, more fulfilled and more human?





Or will it belong to those tools that prize our attention to the exclusion of everything else, compounding our fears and aggregating extremism, to serve ever-more-invasively-targeted ads over all other ambitions?





At Apple, we made our choice a long time ago.





We believe that ethical technology is technology that works for you. It's technology that helps you sleep, not keeps you up. That tells you when you've had enough, that gives you space to create, or draw, or write or learn, not refresh just one more time. It's technology that can fade into the background when you're on a hike or going for a swim, but is there to warn you when your heart rate spikes or help you when you've had a nasty fall. And that all of this, always, puts privacy and security first, because no one needs to trade away the rights of their users to deliver a great product.





Call us naive. But we still believe that technology made by people, for people, and with people's well-being in mind, is too valuable a tool to abandon. We still believe that the best measure of technology is the lives it improves.





We are not perfect. We will make mistakes. That's what makes us human. But our commitment to you, now and always, is that we will keep faith with the values that have inspired our products from the very beginning. Because what we share with the world is nothing without the trust our users have in it.





To all of you who have joined us today, please keep pushing us all forward. Keep setting high standards that put privacy first. And take new and necessary steps to reform what is broken.





We've made progress together, and we must make more. Because the time is always right to be bold and brave in service of a world where, as Giovanni Buttarelli put it, technology serves people, and not the other way around.





Thank you very much.
<div class="linkback">Tags: Tim Cook (https://www.macrumors.com/guide/tim-cook/), privacy (https://www.macrumors.com/guide/privacy/)</div>
This article, "Tim Cook Implies That Facebook's Business Model of Maximizing Engagement Leads to Polarization and Violence (https://www.macrumors.com/2021/01/28/tim-cook-speaks-at-data-protection-conference/)" first appeared on MacRumors.com (https://www.macrumors.com)

Discuss this article (https://forums.macrumors.com/threads/tim-cook-implies-that-facebooks-business-model-of-maximizing-engagement-leads-to-polarization-and-violence.2282315/) in our forums

<div class="feedflare">
<img src="http://feeds.feedburner.com/~ff/MacRumors-Front?d=yIl2AUoC8zA" border="0"></img> (http://feeds.macrumors.com/~ff/MacRumors-Front?a=ToLW0ZxwLm0:30bMCYYJCX0:yIl2AUoC8zA) <img src="http://feeds.feedburner.com/~ff/MacRumors-Front?d=6W8y8wAjSf4" border="0"></img> (http://feeds.macrumors.com/~ff/MacRumors-Front?a=ToLW0ZxwLm0:30bMCYYJCX0:6W8y8wAjSf4) <img src="http://feeds.feedburner.com/~ff/MacRumors-Front?d=qj6IDK7rITs" border="0"></img> (http://feeds.macrumors.com/~ff/MacRumors-Front?a=ToLW0ZxwLm0:30bMCYYJCX0:qj6IDK7rITs)
</div><img src="http://feeds.feedburner.com/~r/MacRumors-Front/~4/ToLW0ZxwLm0" height="1" width="1" alt=""/>

Source: Tim Cook Implies That Facebook's Business Model of Maximizing Engagement Leads to Polarization and Violence (https://www.macrumors.com/2021/01/28/tim-cook-speaks-at-data-protection-conference/)