What is Penn State planning to do with learning analytics? As a faculty member at Penn State, I'm interested in this question, and I've spent the past few months trying to explore it.
In April, several individuals from across Penn State's campuses virtually attended the EDUCAUSE Learning Initiative event on learning analytics.
Also in April, Chris Millet, Simon Hooper, and Bart Pursel, who work in various educational technology groups at the University, gave a presentation at University Park. The presentation offered the campus community a background on learning analytics and outlined current work being done in the area, which includes:
--Early Progress Report: a low-level use of LA that is nonetheless helpful to students and faculty, as it automates the process of alerting faculty and academic advisers to students who are receiving a C or less in any course as of the third week of the semester. Not only is the faculty alert automated, but students also receive an email message alerting them to their own progress.
--examination of the relationship between blog and wiki posting in the learning management system and course GPA. Do students who are more connected and communicative with course colleagues fare better in terms of grades?
--Simon Hooper is helping faculty design better multiple choice tests by analyzing student performance on discrete test questions and comparing it to overall GPA and performance on other assignments involving specific learning objectives. In doing so, "bad" test questions -- those that don't discriminate well between those who have mastered a specific learning objective and those who haven't -- can be eliminated or redesigned.
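The item-analysis work in that last bullet is a well-established technique, sometimes summarized as a "discrimination index." I don't know what tooling Hooper's group actually uses, but as a rough illustration, the idea can be sketched as follows; the function name and all data below are hypothetical:

```python
# Sketch of item-discrimination analysis: for each test question,
# compare how often the top-scoring and bottom-scoring students
# answered it correctly. A question with a low (or negative) index
# fails to separate students who have mastered the material from
# those who haven't, making it a candidate for redesign.

def discrimination_index(responses, totals, top_frac=0.27):
    """responses: 0/1 correctness on one question, one entry per student.
    totals: overall test scores for the same students.
    Returns (proportion correct in top group) - (proportion in bottom group),
    ranging from -1 to 1; higher values mean better discrimination."""
    n = len(responses)
    k = max(1, int(n * top_frac))  # size of top/bottom comparison groups
    # Rank students by overall score, best first
    ranked = sorted(range(n), key=lambda i: totals[i], reverse=True)
    top, bottom = ranked[:k], ranked[-k:]
    p_top = sum(responses[i] for i in top) / k
    p_bottom = sum(responses[i] for i in bottom) / k
    return p_top - p_bottom

# Illustrative data: a question answered correctly mostly by high scorers
correct = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]
scores  = [95, 90, 88, 85, 80, 60, 55, 50, 45, 40]
print(discrimination_index(correct, scores))  # prints 1.0
```

In practice one would also want item difficulty (the overall proportion correct) alongside discrimination, since a question everyone answers correctly can't discriminate at all.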
Then in May I met with Chris Millet from Teaching and Learning with Technology, Penn State's educational technologies group. He had just returned from LAK12, the Second Annual Conference on Learning Analytics and Knowledge, and he graciously shared with me some of what he learned there. Millet also described some of the work of a recently formed Learning Analytics group that has been charged by Penn State Provost Robert Pangborn with exploring and implementing the use of learning analytics at the University.
There is still a lot of work to be done in developing capacity in this area at the University. The choice of a new learning management system to replace ANGEL, our current LMS, will also affect the adoption of learning analytics, since many LMSs now have learning analytics components built in. Unfortunately, any University-wide implementation of learning analytics may be hampered by the College of Medicine's choice to adopt Moodle, an LMS that apparently is not being considered by groups elsewhere at the University. Furthermore, according to Millet, only about 75% of faculty across PSU have even adopted ANGEL. Will the numbers improve with a new LMS? Perhaps the 25% of faculty who have not adopted ANGEL have good pedagogical reasons for not doing so -- maybe they're using technology in other ways.
I noted with interest this quote from Chris Millet concerning data sources for LA: "The analysis of this data, coming from a variety of sources like the LMS, the library, and the student information system, helps us observe and understand learning behaviors in order to enable appropriate interventions." Again, mention of the library as a source of data. Yet I wonder how many librarians know about learning analytics and are currently considering how libraries might be involved.