
Analysis | AI-generated art sounds alarming, but it doesn’t have to be

Just a few months ago, the concept of using artificial intelligence to generate unique works of art seemed cutting edge and futuristic. Soon it will be as mundane as doing a Google search.

Microsoft Corp. announced this week that it is leveraging its $1 billion investment in OpenAI, an artificial intelligence research firm, and bringing that company's standout AI service to Microsoft 365, the company's flagship bundle of software services. Powered by OpenAI's DALL-E 2 technology, Microsoft Designer generates images from whatever description users type into a box, such as "cake with berries, bread and pastries for fall."

It's a quick step forward for DALL-E 2, which was first announced only six months ago. While the Designer app is currently available only in beta, the rollout underscores how quickly art-generating AI has evolved, to the extent that artists have expressed concerns. Some artists' names are especially common as text prompts in similar art generators, which has left them worried about what the technology will do to their careers. AI ethicists are also concerned about a deluge of new fake images hitting the web, fueling disinformation campaigns.

Still, Microsoft's involvement in this area is good news. The company is adhering to OpenAI's cautious, limited rollout of DALL-E 2, as well as its strict rules about the types of images it will generate. For example, DALL-E 2 prohibits explicit sexual and violent content, and does so by simply removing such images from the dataset used to train the model. Microsoft has said it will use similar filters.

Microsoft also said it would block text prompts on "sensitive topics," which it did not specify, but which most likely reflect DALL-E 2's policies on prompts related to things like politics or illegal activity, or images of well-known figures such as politicians or celebrities.

There has been some hand-wringing among tech ethicists that open-source versions of this kind of technology, such as a tool released in August by British startup Stability AI, will lead to a free-for-all of fake content that will infect social networks and disrupt upcoming elections (think fake images of Joe Biden or Donald Trump in controversial situations).

But a carefully curated version of the technology like Microsoft's seems to temper that prospect for two reasons. First, opportunistic photo forgers are more likely to find their efforts hampered by the filters built into the technology. Second, as more people use such tools, the general public will become more aware that photos on the web can be generated by AI.

It is extraordinary that this form of creative artificial intelligence is moving so fast, and that Microsoft's Designer tool will soon stand alongside business-software stalwarts such as Word, Outlook and Excel. The tool is, as some have noted, like clip art on steroids, limited only by a user's imagination.

It also underscores how difficult it can be to predict which direction artificial intelligence will take. A few years ago, tech experts widely expected that we would by now have self-driving trucks and cars on the road, reducing accidents and putting human drivers out of work. Instead it is artists and illustrators who have more cause for concern, although the nature of their work may simply change. As art generation lands at the fingertips of millions of people, they will have to be flexible.

More from Bloomberg Opinion:

Cybersecurity Needs Its Own Sarbanes-Oxley: Tim Culpan

OpenAI project risks bias without more scrutiny: Parmy Olson

Taylor Swift Will Always Be Greater Than AI: Tyler Cowen

This column does not necessarily reflect the views of the editors or Bloomberg LP and its owners.

Parmy Olson is a Bloomberg Opinion columnist on technology. She is a former reporter for the Wall Street Journal and Forbes and the author of “We Are Anonymous.”

More stories like this are available at bloomberg.com/opinion
