Art institutes get creative in tackling AI

Rick Dakan was teaching a comic book class last year with illustration faculty—including a professor who helped draw the famed comic Garfield—when a colleague arrived late because they had been looking at Midjourney, the generative AI platform that creates images from a simple text prompt.

“I remember cornering the fine arts chair, saying, ‘Have you seen this?’” said Dakan, now chair of the AI Task Force at the Ringling College of Art and Design in Sarasota, Fla. “There was this existential dread about it.”

A year later, leaders at some of the top art institutes in the country view artificial intelligence as a next step in the digitization of the art world. Ethical concerns from faculty and students are slowing AI adoption, but the institutes leaning into the technology view it as a potential tool.

“This is as organic and logical as it gets, that an art institution is exactly primed to integrate this,” said Griffin Smith, a lecturer at the Rhode Island School of Design. “RISD is the most punk art school, and I see this as one of the greatest punk opportunities we’ve had.”

AI is being integrated into art institutions much as it is at other colleges across the nation—in coding courses and creative writing classes—but with an added discussion about what this means for artists and for art as a whole.

A Mixed Reception

When generative artificial intelligence catapulted onto the art scene last fall, Jane South saw beyond copyright concerns, ethical conundrums and warnings about the potential downfall of the entire art community.

“I thought, ‘Goody, goody,’” said South, the chair of fine arts at the Pratt Institute. The coming of AI “is one of those moments—[like] when the printing press was invented, when photography came along—that art thrives on. Because it makes us think about what it is we do.”

At RISD, students embraced the technology much more quickly than faculty.

“The shock of the language tool … you can feel it go away overnight,” Smith said. “We’ve accepted it as a given—well, the students did that, but the faculty have a much harder time wrapping their heads around it. [To them] it felt like a crisis, but it doesn’t feel like a creative opportunity. Which I think is an absurd and irresponsible opinion.”

At Ringling, some faculty are beginning to come around to the idea, despite moral opposition.

“Even the most angry illustration faculty have said, ‘I hate it, I wish we could go traditional, but if you’re a student today you would be an idiot if you didn’t learn this before you go into the world,’” Dakan said. “It will be part of your career.”

Art, AI and Policy

The Rhode Island School of Design addresses AI on a faculty-led, class-by-class basis rather than with a blanket policy. RISD sent an email in the spring reminding faculty members that students have to do their own work and left the rest up to individual instructors.

The email said, “‘You can ban it or even encourage them,’ like that was some crazy idea,” said Smith, who teaches the classes Art and AI and Teaching and Writing With AI at RISD. He added he believes most faculty members allow their students to use AI if properly cited. “And already, [the policy] feels so outdated.”

In August, the Pratt Institute released a statement cautiously laying out ground rules for faculty and students, stating that the tools can be put to “creative use as much as misuse.” AI use will remain a faculty decision for the time being.

“However we’re using these tools, we want to use them creatively,” said Chris Alen Sula, Pratt’s associate provost for academic affairs. “And we’re also thinking about critical dimensions of AI. Questions like bias and training sets, how it’s changing labor—lots of questions students have about jobs and what’s happening with industries.”

“I think we want to offer flexibility to our faculty at the moment, at least during this experimental phase,” he said. “I think if we get to the point where this becomes an essential skill, it would be incorporated there in the curriculum, like other skills have been.”

[Image: Griffin Smith has created over 70,000 images, including this one, with Midjourney. Photo illustration by Griffin Smith | Midjourney]

Dakan, at Ringling, first raised the idea of forming policies around artificial intelligence in April. He was motivated after seeing the University of Florida’s plan to hire dozens of AI-focused faculty members across a multitude of disciplines beyond STEM.

“That’s what’s so compelling, is across UF, there is AI literacy in every program,” he said.

Dakan and a team of faculty members drafted a plan for tackling AI—and they admit they used ChatGPT to help brainstorm. Ringling assembled a task force, which worked throughout the summer and ultimately created a resource guide now posted on the Ringling library website. The task force also created and distributed a survey to gauge student and faculty opinion.

The survey, released in August, revealed that 70 percent of students felt “somewhat” or “extremely” negative toward AI, with a majority saying they did not want it included in the curriculum. Dakan now plans to host open forums to hear more from students.

“There is that resistance among the current generation here, and I do think it’s important to really listen to, and honor, that sorrow,” Dakan said. He cited student concerns with copyright, fair use and ethical conundrums. “I think we rushed ahead to figure out what the administration and faculty response would be, and now we’re playing a bit of catch-up involving students as well.”

AI in Art Class

Dakan said Ringling’s classes are attempting to incorporate AI into their teaching as a tool rather than a replacement. For example, a costume design course at Ringling uses AI to gather inspiration before students use their own skills to create the actual costumes.

At RISD, Smith said interest in the technology has spread well beyond the computational tech and culture students he would typically expect to see.

“The niche kids who wanted to learn how to code aren’t the ones banging on our doors anymore; it’s everyone,” he said. “Now the animators are considering it, the industrial architects. Every department at RISD is waiting for the shoe to drop.”

Smith added that he is urging the institution to broaden its course offerings but is hitting roadblocks.

“I’m telling everyone who will listen, we need broader course offerings, but these things move slowly,” he said. “There’s a lot of fear and skepticism around these tools.”

Pratt does not offer a dedicated AI course, but it did introduce a machine learning course in 2020. Alen Sula said other courses have “underpinnings” of AI, including undergraduate coding courses and an undergraduate course called Digital Tools for Artists.

“I think as soon as Photoshop became ubiquitous, faculty have been having conversations with students about Photoshop, and using it as a tool, and the difference between making all of your work on Photoshop and printing it out onto a piece of paper or projecting it onto the wall,” Smith said. “All of those conversations, I think, are very easily transferable to AI.”

Ringling’s Dakan is teaching Writing With AI this semester and will teach Fundamentals of AI next semester. However, he said he believes AI courses will never be required for graduation and that the skills instead need to be incorporated into other courses.

“Having a course is useful, but those are fundamental skills in the different disciplines and they have to be baked in,” he said. “I don’t see our curriculum—or anyone—saying, ‘You need freshman comp and freshman AI.’ Just like we don’t have a class in Photoshop here, or Word.”

He and others encourage faculty not just to tinker with the tools but, as he put it, to “actually use them” and to reconsider how current assignments could be affected by artificial intelligence. In Ringling’s recent survey, roughly one-third of faculty members said more than half of their assignments could be completed with AI.

Ethical Understanding

Nick Montfort, a professor of digital media at the Massachusetts Institute of Technology, said that if the tools are being used, institutions should aim for a deeper overall understanding of the technology.

“The primary concern should be how can we provide new education, how can we inform students, artists, instructors,” he said. “How can we, not so much bolt policies or create guardrails, but how can we help them understand new technologies better and their implications?”

Both students and faculty are concerned about the ethics of how AI systems are trained. Generative AI systems such as DALL-E and Midjourney are “closed,” meaning their training data and methods are not disclosed. Montfort suggests open systems instead, including the Falcon 40B model from the UAE’s Technology Innovation Institute and Meta’s Code Llama.

“One of the things I always recommend for individual practitioners who are using generative AI in their own artistic practice is to use free and open systems,” he said. “These models are free, open, we know what they’re trained on; we have that data.”

Plenty of questions remain about the ethics of AI, particularly its use of artwork to train models. Those issues—chiefly copyright and fair use—are expected to play out in the courts, but the debate is likely to rage on regardless of the legal outcome.

“I think we will continue to have [this conversation], and those are important pushbacks,” South said. “I think that comes back to the philosophical space of art as a way to really question things that have been kind of legally positioned, maybe with very different thoughts in mind.”


