Generative AI’s Role in Transforming Cultural and Creative Industries: New Challenges and Opportunities

The paper “Generative AI, Work, and Risks in Cultural and Creative Industries,” by Emmanuelle Walkowiak and Jason Potts, presents an innovative framework for evaluating the risks and transformative impact of Generative AI (GenAI) on creative industries. By categorizing roles within cultural industries into “job transformation zones” based on GenAI exposure and associated risks, the authors lay a foundation for understanding AI’s impact on creativity. This article expands on their findings, drawing on recent research to examine the nuanced and often conflicting effects of GenAI on creative roles, intellectual property, worker mental health, and bias in content generation.


 

GenAI’s Transformative Impact on Creative Processes

AI serves as an enabler, facilitating more thoughtful, complex, and imaginative tasks.

Epstein et al. (2023) point out that GenAI fundamentally reshapes creative workflows by automating elements of production, from visual art to writing. This shift, observed across multiple creative sectors, complements Walkowiak and Potts’ findings: while GenAI poses a potential threat to traditional roles, it also enhances creative output by automating repetitive aspects of artistic tasks, allowing creatives to focus on high-level decision-making. In design industries, for instance, the focus moves from execution to curatorial oversight. Lyu et al. (2023) provide a compelling example from the jewelry industry, where designers now prioritize aesthetic decisions and emotional resonance while GenAI handles preliminary designs and technical execution.

This shift highlights a new model for creative work: AI serves as an enabler, facilitating more thoughtful, complex, and imaginative tasks by removing rote aspects of production. The challenge now lies in structuring creative workflows to leverage AI’s efficiency without compromising the originality and intuition that define human creativity. This redefined workflow suggests a need for adaptable, AI-integrated roles where creative professionals oversee GenAI outputs, curate final products, and retain control over the creative direction.

 

Intellectual Property and Copyright: A New Frontier

[There is] a growing need for updated legal frameworks that account for AI’s role in content creation.

The implications of GenAI for intellectual property (IP) and copyright are profound. Shumakova et al. (2023) highlight the frequent repurposing of existing content by AI models, raising concerns over potential infringement of creators’ rights. GenAI’s capacity to generate work that mirrors or directly incorporates elements from existing copyrighted content presents a significant challenge to the traditional concept of originality. Walkowiak and Potts similarly acknowledge these risks, particularly in consumer-facing creative sectors where copyright laws are foundational. The resulting disruption underscores a growing need for updated legal frameworks that account for AI’s role in content creation.

Revised copyright laws could incorporate specific provisions for AI-generated content, stipulating clear guidelines on ownership, usage, and attribution. Establishing a legal mechanism for GenAI content would safeguard the rights of original creators and help set boundaries for AI-generated work. These frameworks would be crucial in industries such as film, music, and design, where creators rely heavily on IP protections to secure their work’s originality. Additionally, expanding copyright laws to include AI-driven content licensing and fair use could provide a viable pathway to balancing innovation with the rights of traditional creators.

 

The Psychosocial Impact on Workers: Addressing “Creative Displacement Anxiety”


As GenAI takes on a larger role within creative industries, there’s a growing psychological impact on workers. Caporusso (2023) introduces “Creative Displacement Anxiety” — a phenomenon describing the mental health risks that arise from the uncertainty and perceived threat of AI replacing creative roles. This concept dovetails with Walkowiak and Potts’ exploration of mental health risks for cultural workers, particularly in fields where job displacement by AI is perceived as imminent. In addition to the economic implications, the anxiety stemming from potential obsolescence poses challenges for sustaining a healthy creative workforce.

To address this anxiety, organizations could introduce mental health support initiatives, such as counseling and AI literacy programs, to help workers understand AI’s role as a complement rather than a replacement. Promoting a view of AI as a collaborative tool rather than a competitive force could alleviate some stress, allowing creatives to see GenAI as enhancing their capabilities rather than threatening their roles. Upskilling programs that empower creative professionals to interact with and guide AI systems could also mitigate fears by fostering a sense of control and ownership over AI-assisted creative processes.

 

Bias and Misinformation in AI-Generated Content

Unchecked biases in AI outputs could shape societal narratives in unintended ways.

The potential for bias in AI-generated content is a pressing issue, particularly in creative industries that shape public opinion and cultural narratives. Tredinnick and Laybats (2023) argue that GenAI’s black-box nature often produces outputs that reinforce the biases and stereotypes present in training data, risking their perpetuation at cultural scale. This finding aligns with Walkowiak and Potts’ analysis, which emphasizes the need for robust oversight to address the misinformation and bias risks inherent in AI-generated content.

Given the cultural influence of creative industries, unchecked biases in AI outputs could shape societal narratives in unintended ways. To mitigate this risk, creative organizations might consider ethical guidelines and training data diversity standards to ensure GenAI models produce balanced representations. Expanding oversight mechanisms, such as AI content review boards, would help maintain accountability in high-stakes creative outputs, enabling organizations to identify and rectify biased outputs before they reach audiences. An emphasis on diverse, representative data in GenAI training could support a broader range of perspectives, helping AI outputs reflect a more inclusive cultural landscape.

 

A Tailored Approach to AI Risk Management in Creative Roles

Walkowiak and Potts’ categorization of roles within “job transformation zones” offers a powerful tool for understanding varying levels of GenAI exposure across different creative tasks. Their task-based framework echoes Thibault et al. (2023), who argue for customized risk management strategies based on the intensity and type of GenAI application within specific creative roles. Because creative industries are highly varied, with unique challenges in different fields, the framework underscores the need for tailored upskilling strategies that account for the specific risks GenAI presents in each role.

For instance, journalists facing misinformation risks might require training in digital literacy and fact-checking methodologies, while graphic designers may benefit more from IP and copyright protection training. This targeted approach ensures that workers develop relevant skills aligned with the particular GenAI risks they face. Establishing industry-specific training programs that address AI risks and ethical considerations for each creative sector could prepare professionals for safe, effective collaboration with GenAI systems.

 

Toward a Balanced Integration of GenAI in Cultural and Creative Industries

In aligning GenAI’s capabilities with a framework of accountability and ethical standards, the creative industries can establish AI as a tool for innovation rather than disruption.

Walkowiak and Potts’ research highlights a pivotal moment for creative industries. As GenAI’s role expands, a careful balance must be struck between innovation and risk management to support sustainable growth in creative fields. Moving forward, the cultural sector has an opportunity to redefine creative processes, ensuring that AI integration prioritizes both productivity and ethical integrity. By addressing GenAI’s potential and risks through comprehensive policy, industry-specific guidelines, and targeted skill development, cultural industries can lead by example in building an AI-enhanced yet responsible creative ecosystem.

In aligning GenAI’s capabilities with a framework of accountability and ethical standards, the creative industries can establish AI as a tool for innovation rather than disruption. This responsible integration would not only safeguard the roles of creatives but also uphold the values of originality, inclusivity, and cultural diversity that define the sector. As we embrace GenAI’s transformative potential, a proactive approach that combines regulatory foresight, adaptive risk management, and support for mental well-being will be essential in fostering a creative future where AI and human ingenuity thrive side by side.


 

Dr. Emmanuelle Walkowiak, Vice-Chancellor’s Senior Research Fellow at RMIT and research affiliate with the ARC Centre of Excellence for Automated Decision-Making and Society, brings 20 years of expertise in examining how technology transforms work and inclusivity. Her research focuses on digital transformation’s impact on workplace quality, neurodiversity, AI inclusivity, and blockchain governance. She leads the FLOW-GenAI initiative, exploring the evolving nature of work with Generative AI. Dr. Walkowiak earned her PhD in Economics from University Paris IX-Dauphine and has conducted research across the US, UK, and Europe.


Dr. Jason Potts is an economist renowned for his work in evolutionary economics and complex systems theory. Based at RMIT University, he is the Director of the Blockchain Innovation Hub and a leading researcher on blockchain economics and the role of creative industries in innovation-led growth. His concept of the “innovation commons,” inspired by Elinor Ostrom, has expanded the field’s understanding of shared innovation resources. Dr. Potts holds a B.Com (Hons) from the University of Otago and a PhD in Economics from Lincoln University, New Zealand.


References & Interesting Reads

  • Epstein, Z., Hertzmann, A., Herman, L., Mahari, R., Frank, M., Groh, M., Schroeder, H., Smith, A., Akten, M., Fjeld, J., Farid, H., Leach, N., Pentland, A., & Russakovsky, O. (2023). Art and the science of generative AI. Science, 380, 1110–1111. https://doi.org/10.1126/science.adh4451

  • Banks, M., Lovatt, A., O’Connor, J., & Raffo, C. (2000). Risk and trust in the cultural industries. Geoforum, 31, 453–464. https://doi.org/10.1016/S0016-7185(00)00008-7

  • Shumakova, N. I., Lloyd, J. J., & Titova, E. V. (2023). Towards Legal Regulations of Generative AI in the Creative Industry. Journal of Digital Technologies and Law. https://doi.org/10.21202/jdtl.2023.38

  • Orchard, T., & Tasiemski, L. (2023). The rise of Generative AI and possible effects on the economy. Economics and Business Review, 9, 9–26. https://doi.org/10.18559/ebr.2023.2.732

  • Anantrasirichai, N., & Bull, D. (2022). Artificial intelligence in the creative industries: a review. Artificial Intelligence Review, 55, 589–656. https://doi.org/10.1007/s10462-021-10039-7

  • Lyu, Y., Shi, M., Zhang, Y., & Lin, R. (2023). From Image to Imagination: Exploring the Impact of Generative AI on Cultural Translation in Jewelry Design. Sustainability. https://doi.org/10.3390/su16010065

  • Tredinnick, L., & Laybats, C. (2023). Black-box creativity and generative artificial intelligence. Business Information Review, 40, 98–102. https://doi.org/10.1177/02663821231195131

  • Thibault, M., Kivikangas, T., Roihankorpi, R., Pohjola, P., & Aho, M. (2023). Who am AI?: Mapping Generative AI Impact and Transformative Potential in Creative Ecosystem. Proceedings of the 26th International Academic Mindtrek Conference. https://doi.org/10.1145/3616961.3617804

  • Woodruff, A., Shelby, R., Kelley, P. G., Rousso-Schindler, S., Smith-Loud, J., & Wilcox, L. (2023). How Knowledge Workers Think Generative AI Will (Not) Transform Their Industries. arXiv. https://doi.org/10.48550/arXiv.2310.06778

  • Caporusso, N. (2023). Generative Artificial Intelligence and the Emergence of Creative Displacement Anxiety. Research Directs in Psychology and Behavior. https://doi.org/10.53520/rdpb2023.10795
