Abstract and 1. Introduction
Prior Work and 2.1 Educational Objectives of Learning Activities
2.2 Multiscale Design
2.3 Assessing Creative Visual Design
2.4 Learning Analytics and Dashboards
Research Artifact/Probe
3.1 Multiscale Design Environment
3.2 Integrating a Design Analytics Dashboard with the Multiscale Design Environment
Methodology and Context
4.1 Course Contexts
4.2 Instructor interviews
Findings
5.1 Gaining Insights and Informing Pedagogical Action
5.2 Support for Exploration, Understanding, and Validation of Analytics
5.3 Using Analytics for Assessment and Feedback
5.4 Analytics as a Potential Source of Self-Reflection for Students
Discussion + Implications: Contextualizing Analytics to Support Design Education
6.1 Indexicality: Demonstrating Design Analytics by Linking to Instances
6.2 Supporting Assessment and Feedback in Design Courses through Multiscale Design Analytics
6.3 Limitations of Multiscale Design Analytics
Conclusion and References
A. Interview Questions
We present findings from our grounded theory qualitative analysis of instructor interviews regarding their experiences with the research artifact / probe. Through analysis of the interview data, we developed four categories, which illustrate (1) analytics providing insights and informing pedagogical action, (2) support for exploration, understanding, and validation of analytics, (3) use of analytics for assessment and feedback, and (4) analytics supporting students’ self-reflection.
Instructors in our study reported that multiscale design analytics provide them with novel and useful insights. I6 compared the experience with learning management systems, such as Canvas and Blackboard, and noted that the scale and cluster analytics offer unique insights that they had not encountered in any other system. According to I1 and I5, the scale and cluster analytics help them understand students’ progress on their design projects. I2 found the analytics particularly useful for understanding how students develop and present structure in their designs.
I1: I think using the dashboard and using the analytics is really helpful for me to kind of get an understanding of what [students are] doing.
I2: I’ve been thinking like, you know, [scales and clusters] could be a very useful information for me, you know in terms of how students develop structure and present that structure at different levels.
We find initial evidence for the value of the insights provided by multiscale design analytics as a basis for pedagogical intervention [50, 68]. I9 expressed that these analytics can help them find out whether students are able to use the multiscale design environment effectively, and to take action, such as adjusting the curriculum, as needed.
I9: If there are multiple scales and clusters…they are at least using the environment efficiently. So if this number is extremely low for everybody…then maybe you need to [give] a tutorial on [the design environment].
Aiding comprehensibility is an important challenge in developing user interfaces for AI-based technologies [67]. AI-based multiscale design analytics represent complex characteristics of students’ design work. Instructors found that our dashboard design helps them explore and understand multiscale design analytics. In particular, instructors expressed that the links on the dashboard, in conjunction with visual annotations of how the algorithm operated on design instances, helped them explore and understand the relationships between the analytics and the scales and clusters they represent. I1 and I5 further expressed a desire to navigate to specific scales and clusters within a design.
I1: I’m really enjoying these links that I can kind of click on it…with the scales or clusters like they can take me to those. I was wondering…whether it would be possible to…maybe like pinpoint or just kind of go to the precise scale.
I9: I was able to infer…there is one zoom level that has a particular region…and then they have a different zoom level that focuses on a different region and so on.
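To make the linking that these instructors describe concrete, the sketch below shows one way an analytic value could record the design instances it was computed from, so that a dashboard link can navigate to the corresponding scale and region. The paper does not specify the dashboard’s data model; the classes, fields, and `navigate_to` function here are hypothetical, for illustration only.

```python
# A minimal sketch of linking analytics to the design instances they measure,
# so a dashboard can navigate from a metric to the scales and clusters behind
# it. The data model is assumed, not taken from the paper.
from dataclasses import dataclass, field

@dataclass
class Cluster:
    cluster_id: int
    scale: float   # zoom level at which this cluster appears
    bounds: tuple  # (x, y, width, height) region on the design canvas

@dataclass
class AnalyticValue:
    name: str      # e.g., "number of clusters"
    value: float
    instances: list = field(default_factory=list)  # provenance: measured clusters

def navigate_to(instance: Cluster) -> None:
    """Stand-in for the dashboard action: pan and zoom the design view to the
    scale and region that the clicked analytic refers to."""
    print(f"Zoom to scale {instance.scale}, region {instance.bounds}")

# Clicking an analytic on the dashboard follows its links to concrete instances.
clusters = [Cluster(1, 0.5, (10, 10, 200, 150)), Cluster(2, 2.0, (400, 80, 90, 60))]
metric = AnalyticValue("number of clusters", len(clusters), clusters)
for c in metric.instances:
    navigate_to(c)
```

Keeping per-instance provenance alongside each analytic value is what makes such click-through links, and the kind of validation described next, possible.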
Linking the analytics to the design assemblages they measure supported instructors in giving feedback that validates and refines what is measured. As instructors inspected the specific regions represented by the analytics, they expressed where the AI’s interpretation mismatched their own.
I3: “I’m not sure why [it shows here] two different ones…you’ve got a couple [extra] clusters.”
I1, I3, I4, and I9 found our dashboard’s visual annotation of the designs, which concretely represents the analytics through an animation of scales and clusters, helpful in understanding the analytics. The animation, which presents the clusters found at each scale one by one, helped instructors understand how design elements form spatial clusters across scales. As I1 expressed, *“I think I now have a better understanding of spatial clusters [with] the animation of colors changing”*.
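As a rough illustration of how clusters can form and merge across scales, the sketch below applies density-based clustering to element positions at increasing radii and steps through the results scale by scale, as the animation does. The paper does not name its clustering algorithm; DBSCAN, the element positions, and the radii here are assumptions for illustration.

```python
# A minimal sketch of multiscale spatial clustering, stepping through the
# clusters found at each scale one by one, as the dashboard animation does.
# DBSCAN and the positions/radii below are assumed, not taken from the paper.
import numpy as np
from sklearn.cluster import DBSCAN

def clusters_per_scale(positions, radii, min_samples=2):
    """Cluster 2D design-element positions at each spatial scale (radius)."""
    results = {}
    for eps in radii:
        # Label -1 marks elements that join no cluster at this scale.
        results[eps] = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(positions)
    return results

# Hypothetical element positions on a sparse 2D canvas.
positions = np.array([[0, 0], [1, 1], [2, 0], [40, 40], [41, 42], [100, 5]])

# Present each scale's clusters in turn, small radius to large.
for eps, labels in clusters_per_scale(positions, radii=[3, 60, 120]).items():
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    print(f"scale eps={eps}: {n_clusters} cluster(s), labels={labels.tolist()}")
```

At small radii, only tightly grouped elements cluster; at larger radii, groups merge. Stepping through these results in sequence conveys the same cross-scale structure the animation presents.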
:::info Authors:
(1) Ajit Jain, Texas A&M University, USA; Current affiliation: Audigent;
(2) Andruid Kerne, Texas A&M University, USA; Current affiliation: University of Illinois Chicago;
(3) Nic Lupfer, Texas A&M University, USA; Current affiliation: Mapware;
(4) Gabriel Britain, Texas A&M University, USA; Current affiliation: Microsoft;
(5) Aaron Perrine, Texas A&M University, USA;
(6) Yoonsuck Choe, Texas A&M University, USA;
(7) John Keyser, Texas A&M University, USA;
(8) Ruihong Huang, Texas A&M University, USA;
(9) Jinsil Seo, Texas A&M University, USA;
(10) Annie Sungkajun, Illinois State University, USA;
(11) Robert Lightfoot, Texas A&M University, USA;
(12) Timothy McGuire, Texas A&M University, USA.
:::
:::info This paper is available on arxiv under CC by 4.0 Deed (Attribution 4.0 International) license.
:::