Author Topic: Midjourney ends free trials of its AI image generator due to 'extraordinary' abuse  (Read 178 times)
HCK
« on: March 31, 2023, 04:05:04 pm »


<p>Midjourney is putting an end to free use of its <a data-i13n="cpos:1;pos:1" href="https://www.engadget.com/ai-generated-images-from-text-cant-be-copyrighted-us-government-rules-174243933.html">AI image generator</a> after people created high-profile deepfakes using the tool. CEO David Holz says <a data-i13n="cpos:2;pos:1" href="https://discord.com/channels/662267976984297473/942231458918064148/1090529417043918959">on Discord</a> that the company is ending free trials due to &quot;extraordinary demand and trial abuse.&quot; New safeguards haven't been &quot;sufficient&quot; to prevent misuse during trial periods, Holz says. For now, you'll have to pay at least $10 per month to use the technology.</p><p>As The Washington Post <a data-i13n="cpos:3;pos:1" href="https://www.washingtonpost.com/technology/2023/03/30/midjourney-ai-image-generation-rules/">explains</a>, Midjourney has found itself at the center of unwanted attention in recent weeks. Users relied on the company's AI to create deepfakes of Donald Trump <a data-i13n="cpos:4;pos:1" href="https://www.washingtonpost.com/politics/2023/03/22/trump-arrest-deepfakes/">being</a> arrested and Pope Francis <a data-i13n="cpos:5;pos:1" href="https://twitter.com/broderick/status/1640039713649094656">wearing</a> a trendy coat. While the pictures were quickly identified as bogus, there's a concern that bad actors might use Midjourney, OpenAI's <a data-i13n="cpos:6;pos:1" href="https://www.engadget.com/dall-e-api-apps-151625771.html">DALL-E</a> and similar generators to spread misinformation.</p><span id="end-legacy-contents"></span><p>Midjourney has acknowledged trouble establishing content policies. In 2022, Holz justified a ban on images of Chinese leader Xi Jinping by telling Discord users that his team only wanted to &quot;minimize drama,&quot; and that maintaining any access in China was more important than allowing satirical content. In a Wednesday chat with users, Holz said he was having difficulty setting content policies as the AI enabled ever more realistic imagery. Midjourney hopes to improve the AI moderation that screens for abuse, the founder added.</p><p>Some developers have resorted to strict rules to prevent incidents. OpenAI, for instance, bars any images of ongoing political events, conspiracy theories and politicians. It also forbids hate, sexuality and violence. Others have relatively loose guidelines, however. Stability AI won't let Stable Diffusion users <a data-i13n="cpos:7;pos:1" href="https://www.engadget.com/stable-diffusion-version-2-update-artist-styles-nsfw-work-124513511.html">copy styles or make not-safe-for-work pictures</a>, but it generally doesn't dictate what people can make.</p><p>Misleading content isn't the only problem for AI image production. There are longstanding <a data-i13n="cpos:8;pos:1" href="https://www.engadget.com/dall-e-generative-ai-tracking-data-privacy-160034656.html">concerns that the pictures are stolen</a>, as the generators frequently use existing images as reference points. While some companies are <a data-i13n="cpos:9;pos:1" href="https://www.engadget.com/adobe-is-bringing-generative-ai-features-to-photoshop-after-effects-and-premiere-pro-130034491.html">embracing AI art</a> in their products, there's also <a data-i13n="cpos:10;pos:1" href="https://www.engadget.com/getty-images-ai-generated-content-ban-153337930.html">plenty of hesitation</a> from firms worried they'll get unwanted attention.</p>This article originally appeared on Engadget at https://www.engadget.com/midjourney-ends-free-trials-of-its-ai-image-generator-due-to-extraordinary-abuse-153853905.html?src=rss

Source: Midjourney ends free trials of its AI image generator due to 'extraordinary' abuse