Friday, June 28, 2013

Tuition lists released

The Department of Ed has released its <a href="http://collegecost.ed.gov/catc/#">annual lists of the colleges with the highest/lowest tuition and net prices</a>, and a good summary is available <a href="http://www.insidehighered.com/news/2013/06/28/education-department-releases-annual-tuition-pricing-lists">here</a>.

One nice thing about this list, as opposed to how "most expensive" data has been presented elsewhere, is that in addition to listing the highest tuition as per the college catalog, it also lists net price, which takes into account financial aid. So, yes, Columbia University has the country's highest tuition, but it doesn't appear on the highest net price list.

But I noticed one particular finding: Among private, not-for-profit colleges, the music and art schools are way expensive. The top five for net price (and seven of the top ten) are art or music schools. Arts instruction is expensive because there is so much one-on-one teaching. Yet - what kind of earning potential do their graduates have? In current discussions of the cost of college, the job market, and both institutional and individual accountability, this is one sector that hasn't received much criticism. If we're going to discuss whether it's smart to get a philosophy B.A., we should also discuss whether a B.F.A. is a smart move.

Wednesday, June 26, 2013

Nate Silver's data analyses have become popular for a reason, and here's a new one I like on the "decline" of liberal arts.

In my structure of higher education course, we discuss the putative decline of liberal arts colleges, liberal arts majors, and the liberal arts in general - three separate issues that tend to be conflated, even by the scholars studying them. Silver is focused on the second of those, the number of students majoring in the liberal arts, and he succinctly points out that about the same number of young people are going to college and getting English degrees as 40 years ago. It's just that more people are going to college, and we have pretty good reason to believe they are a different population of students. He doesn't unpack that notion very much, but expanding access and intensifying credentialing induce students who might not otherwise go to college to go, and they make different sets of choices. Students aren't converting away from English; we're gaining new vocational students.

I'm going to add this piece to the reading list for structure for the fall semester.

Friday, May 3, 2013

Data at AERA

At AERA I attended a range of presentations, using data and analytic techniques from the historical to the quantitative. Many of the presenters struggled with shortcomings, or at least limitations, in their data. One paper used the average wealth in zip codes to approximate student family income - problematic in some ways, but a commonly used tactic. Other data issues proved more intractable. For example, Brian Bridges used some data from an ACE survey of presidents to examine HBCU presidents, but the data wasn't broken down by public/private institutional control. Another paper used a survey conducted with the development office to predict young alumni giving, but the researcher wasn't allowed to ask about income - surely an important predictor. Of course, the alternative would have been no data at all. One takeaway I was reminded of was that people who work for higher ed organizations have some of the best access to pre-existing datasets, but not necessarily the time to mine them on their own - so this is a great place for scholars looking to collaborate.
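The zip-code tactic mentioned above is mechanically simple: attach an area-level income figure to each student record by matching on zip code. A minimal sketch with pandas, using entirely hypothetical column names and values (real work would pull the income figures from Census/ACS tables):

```python
import pandas as pd

# Hypothetical student records keyed by ZIP code
students = pd.DataFrame({
    "student_id": [1, 2, 3],
    "zip": ["10001", "10001", "60637"],
})

# Hypothetical ZIP-level median household income (e.g., from Census data)
zip_income = pd.DataFrame({
    "zip": ["10001", "60637"],
    "median_income": [65000, 48000],
})

# Attach the area-level proxy to each student record
students = students.merge(zip_income, on="zip", how="left")
print(students)
```

The obvious limitation is exactly the one the presenters wrestled with: every student in a zip code gets the same value, so within-area variation in family income disappears.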

Saturday, March 16, 2013

Data tracking and display for personal motivation

Here is a more personal case of data display - an extremely low-tech management information system of sorts.

I have a lot of grading and no teaching assistant; in a typical week, I need to grade about nine sets of assignments. Last semester, I fell behind, so this semester I was determined to stay on top of things.

I needed a system that would track my grading, provide me with a reward, and allow others to monitor my progress. Not that anyone in my department is watching over my shoulder, but the fact that they could is an incentive.

Thus, the Peeps calendar with Lego Star Wars stickers. Every assignment I grade, I get a sticker. Yes, motivating me is much like motivating a toddler. As you can see here, I try to reserve grading for Monday through Friday. I was doing all right until the week of Feb. 17-23, when I was laid low by a nasty cold and it was all I could do to drag myself to class.

After a few weeks, I added to the system with a small white board that lists assignments that are turned in but not yet graded. It handily beats the system of sticky notes I was using!

Although this seems like a rather minor application of data display (and certainly an inelegant one), it has provided me with some useful feedback. I have found myself getting slightly behind in March, for example, primarily because advising takes up much of my time in the period before fall registration (as you can see on the white board). Knowing that allows me to decide how to tackle the problem before it gets out of hand, and in time to give students realistic midterm grades. Also, I get stickers.

Tuesday, March 12, 2013

This blog is mentioned over at Paper Raven Editing in a post about academic blogging. Paper Raven offers editing services, and the brain behind it runs a great blog with advice about academic writing.

Friday, March 1, 2013

Presenting network analysis

The Chronicle of Higher Education ran an article on how colleges select peers, both actual and aspirational, and with it they ran an interactive graphic showing what colleges select as their actual peers. (May be behind a paywall.) The data comes from the always-popular IPEDS, but the Chronicle put a fair amount of work into creating a more digestible presentation using Gephi, an open-source tool I have not yet tried.

I use the results in my Structure of Higher Education class. Students explored the data on their own, and we looked at it in class. In general, students found the graphic intuitive to use, although there were some technical issues: Some students reported that the data didn't load, or loaded very slowly. I found that it wouldn't load at all on my work PC, running college-mandated IE8 - but it worked, much to my surprise, on my iPhone. (At least I received the courtesy of an error telling me what browsers would work.)

The resulting graphic shows some things that tables couldn't easily. For example, some institutions cluster tightly, choosing each other as peers, whereas others do not. That's immediately visible by clicking on any institution, represented by a dot, and seeing whether the other dots that light up are near or far from it.
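Under the hood, peer selection is just a directed graph: each institution names the peers it chooses, and tight clusters show up as reciprocated picks. A minimal sketch in plain Python, with made-up institution names (the Chronicle's actual data comes from IPEDS peer lists):

```python
# Hypothetical peer selections: (chooser, chosen) pairs
peer_picks = {
    ("Alpha College", "Beta College"),
    ("Beta College", "Alpha College"),      # reciprocated: a tight cluster
    ("Gamma University", "Alpha College"),  # one-way, aspirational pick
}

# Reciprocated picks are the mutually chosen pairs
mutual = {tuple(sorted((a, b))) for a, b in peer_picks if (b, a) in peer_picks}
print(mutual)  # {('Alpha College', 'Beta College')}
```

Tools like Gephi essentially lay out this graph so that mutually connected nodes sit near each other, which is why the clustering jumps out visually.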

One minor distraction was that the entire graphic was not visible in one window. I didn't expect this on my iPhone, naturally, but even on a traditional monitor, only part of the data was visible at once, even when zoomed out. Ideally, the zoom level would allow it to all be seen at once. One potential fix would be to move institutional data from the right to the top or bottom, given the landscape orientation of the graphic. There is actually relatively little of what Tufte calls "chartjunk" taking up valuable screen space - the browser itself uses some space, but this is out of the Chronicle's control.

Overall, it's a relatively sophisticated presentation of data that really adds value above what a strictly tabular format would.

Monday, February 11, 2013

Higher education trends

<a href="http://theturducken.blogspot.com/2013/02/its-credential-not-education.html">A post</a> on my more personal blog on how to think about MOOCs.

Friday, January 11, 2013

Known unknowns

When I was working on my dissertation, one question I tried to find an answer to was, "What proportion of board members at private colleges donate in year [x]?" This seems like a straightforward question - it's inherently quantitative, after all. Moreover, it's something frequently discussed in the nonprofit and education sectors. The phrase "give, get, or get off" is used to describe what board members ought to do when it comes to making donations. So, you might imagine someone has researched it - but you, and I, would be wrong.

The question breaks down into two parts. One, how many members were on the board of Your College last year? Two, how many of them donated during that fiscal year?

IPEDS, the ultimate data source for researchers in this field, doesn't have the answer to either one. (What makes IPEDS so great? It's a survey that is conducted every year, so it's up to date, and every institution that gets federal funding must participate. That includes indirect funding, such as federal grants/loans to students, so it means nearly every institution in the country.) Various surveys do ask how many board members a college has, but none of them are conducted every year, nor are any of them comprehensive. Many college websites provide enough trustee data to at least find out how many there are, but not all.

The second question, about how many gave, isn't asked at all. I won't list every data source that can't help us, but I will mention one: The Voluntary Support of Education Survey, one of my personal go-to sources, doesn't tell us, either. It asks for the number of board gifts made in a year, which isn't the same thing: It includes gifts from emeritus and honorary members, who are non-voting. Besides, if a single donor gave twice during the year, that would show up as two gifts. We want to count donors, not donations.
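The gap between "number of gifts" and "number of donors" is easy to see with a toy calculation. A minimal sketch, with hypothetical names and amounts, of why a VSE-style gift count overstates voting-board participation:

```python
# Hypothetical gift records for one fiscal year: (donor, amount)
gifts = [
    ("Trustee A", 500),
    ("Trustee A", 250),    # same donor, second gift
    ("Trustee B", 1000),
    ("Emeritus C", 100),   # non-voting; still counted as a "board gift"
]

voting_board = {"Trustee A", "Trustee B", "Trustee D"}

n_gifts = len(gifts)  # what a gift-count survey item reports: 4
donors = {name for name, _ in gifts if name in voting_board}
n_donors = len(donors)  # what we actually want to know: 2
print(n_gifts, n_donors)  # 4 2
```

Here a survey counting gifts would report four, while only two of three voting trustees actually gave - the statistic the "give, get, or get off" debate turns on.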

But say you don't care about board giving - I use this example as an illustration. There are lots of data problems like this in education research. No one with serious funding has decided they are important enough to throw major resources at answering them, but they are important enough to practitioners that anecdotal evidence abounds in white papers and consulting firms' reports.

Sunday, January 6, 2013

Mapping U.S. colleges and universities

I teach a course at CUNY Baruch on the structure of U.S. higher education, and a large part of what I try to do in that class is to encourage students to be able to segment higher education in various ways: How are research universities different from small liberal arts colleges? How are public institutions different from private, not-for-profit institutions?

One of the tools I use to help them do this is a Google Fusion map:

In the screen shot above, all institutions in the continental states and D.C. are shown (with the exception of Hillsdale College). Red dots are two-year institutions; blue dots are four-year institutions. Even in this view we can notice at least one pattern: Heavily populated areas have more institutions than more sparsely populated areas.

By clicking on "filter," users can view a subset of institutions, such as HBCUs, colleges with enrollment of 1,000-1,500 full-time students, or private colleges in California.
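The filters above are simple attribute selections on an institutional dataset. A minimal sketch with pandas, using a tiny made-up slice of IPEDS-style data (the real file has thousands of rows and IPEDS's own variable names):

```python
import pandas as pd

# A tiny hypothetical slice of IPEDS-style institutional data
inst = pd.DataFrame({
    "name": ["A College", "B University", "C College"],
    "state": ["CA", "NY", "CA"],
    "control": ["Private not-for-profit", "Public", "Private not-for-profit"],
    "ft_enrollment": [1200, 25000, 900],
})

# e.g., private colleges in California with 1,000-1,500 full-time students
subset = inst[(inst["state"] == "CA")
              & (inst["control"] == "Private not-for-profit")
              & inst["ft_enrollment"].between(1000, 1500)]
print(subset["name"].tolist())  # ['A College']
```

Fusion's filter menus do essentially this, just behind an interface that my students found harder to operate than it needed to be.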

The map was created in Google Fusion using IPEDS data. I chose Google Fusion because I have no experience with GIS and it is free. However, it has some serious drawbacks that make it much less user-friendly than I would like; ultimately, this means my students don't take full advantage of it.

Fusion is often non-intuitive (for example, figuring out how to upload a CSV file and convince Google to use it as a Fusion table rather than placing it in your Drive) and has surprising limitations (a column can't be deleted if the table is the result of merged tables). That is less problematic than the difficulties it presents to the end users.

One challenge is that what users see is exactly what the table editors see. I can't hide the spreadsheet from them, for example. Students have a fair amount of difficulty learning to operate the filters; there are too many irrelevant menus (e.g., File, Edit, Tools). They don't need to know how the sausage is made - they just need a sausage and a fork.

The larger problem is that the presentation of data is surprisingly limited. My biggest wish is for students to be able to show different types of institutions in different colors. It would be very instructive, for example, to show private and public institutions in different colors, and then turn that option off when examining something else. However, Fusion doesn't allow icons to change on the fly, so I settled on two-year versus four-year as the default. This makes it much harder to see the proportion of privates versus publics if one has to toggle back and forth between the two views, or (worse) mouse over every individual institution.

Ultimately, the map is not entirely successful as a pedagogical tool. For a savvy user, it could be useful for the creation of graphics. Fusion seems to be a case study in forgetting who the most important audience is for a software product - not the developers, but the users of what the developers produce.