Evaluation of creativity support tools

Develop and establish standardized, widely accepted methodologies for evaluating creativity support tools that (i) operationalize the ambiguous and multifaceted nature of creativity, (ii) account for diverse and sometimes contradictory user needs across creative contexts, and (iii) specify which aspects of creativity support tools should be evaluated.

Background

The paper reviews process- and product-focused approaches to evaluating creativity support tools and notes persistent difficulties stemming from the ambiguity of creativity, varied user needs across domains, and a lack of consensus on what to measure. These challenges complicate comparisons across tools and hinder cumulative progress in the field.

The authors combine product metrics (e.g., homogenization measured via semantic similarity; TTCT-derived fluency, flexibility, and elaboration scores) with process measures (e.g., the Creativity Support Index (CSI) and responsibility judgments), illustrating a mixed approach while acknowledging that the broader evaluation problem remains unresolved.
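To make the homogenization metric concrete, the sketch below computes mean pairwise similarity across a set of generated ideas: higher scores mean more homogeneous (less diverse) output. This is a simplified stand-in, not the paper's implementation: it uses bag-of-words cosine similarity where an actual study would use sentence embeddings, and the function and variable names are illustrative.

```python
from collections import Counter
import itertools
import math


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0


def homogenization_score(ideas: list[str]) -> float:
    """Mean pairwise similarity over all idea pairs (hypothetical metric name).

    Bag-of-words vectors stand in for the semantic embeddings a real
    evaluation would use; the aggregation (mean over pairs) is the part
    being illustrated.
    """
    vecs = [Counter(idea.lower().split()) for idea in ideas]
    pairs = list(itertools.combinations(vecs, 2))
    return sum(cosine(a, b) for a, b in pairs) / len(pairs)


ideas = [
    "use a drone to deliver packages",
    "use a drone to deliver groceries",
    "train pigeons to carry messages",
]
score = homogenization_score(ideas)  # between 0 (all distinct) and 1 (identical)
```

Comparing this score between a group ideating with an LLM and a control group would operationalize the "homogenization" product metric described above.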

References

Evaluation of CSTs has remained an open problem since essentially the beginning of CST research, due in part to the ambiguous and multifaceted nature of "creativity" as a phenomenon, in part to the wide range of (sometimes contradictory) user needs associated with different creative contexts, and in part to the lack of a clear consensus around what aspects of CSTs should be evaluated.

Homogenization Effects of Large Language Models on Human Creative Ideation (2402.01536 - Anderson et al., 2 Feb 2024) in Section 2.3 (Evaluating Creativity and CSTs)