Friday, June 28, 2013
One nice thing about this list, as opposed to how "most expensive" data has been shown other places, is that in addition to listing the highest tuition as per the college catalog, it also lists net price, which takes into account financial aid. So, yes, Columbia University has the country's highest tuition, but it doesn't appear on the highest net price list.
But I noticed one particular finding: Among private, not-for-profit colleges, the music and art schools are strikingly expensive. The top five for net price (and seven of the top ten) are art or music schools. Arts instruction is expensive because there is so much one-on-one teaching. Yet what kind of earning potential do their graduates have? In current discussions of the cost of college, the job market, and both institutional and individual accountability, this is one sector that hasn't received much criticism. If we're going to discuss whether it's smart to get a philosophy B.A., we should also discuss whether a B.F.A. is a smart move.
Wednesday, June 26, 2013
In my structure of higher education course, we discuss the putative decline of liberal arts colleges, liberal arts majors, and the liberal arts in general - three separate issues that tend to be conflated, even by the scholars studying them. Silver focuses on the second of these, the number of students majoring in the liberal arts, and he succinctly points out that about the same number of young people are going to college and getting English degrees as 40 years ago. It's just that more people are going to college, and we have good reason to believe they are a different population of students. He doesn't unpack that notion very much, but expanding access and intensifying credentialing induce students who might not otherwise go to college to enroll, and those students make different sets of choices. Students aren't converting away from English; we're gaining new vocational students.
I'm going to add this piece to the reading list for the structure course in the fall semester.
Saturday, March 16, 2013
I have a lot of grading and no teaching assistant; in a typical week, I need to grade about nine sets of assignments. Last semester, I fell behind, so this semester I was determined to stay on top of things.
I needed a system that would track my grading, provide me with a reward, and allow others to monitor my progress. Not that anyone in my department is watching over my shoulder, but the fact that they could is an incentive.
Thus, the Peeps calendar with Lego Star Wars stickers. Every assignment I grade, I get a sticker. Yes, motivating me is much like motivating a toddler. As you can see here, I try to reserve grading for Monday through Friday. I was doing all right until the week of Feb. 17-23, when I was laid low by a nasty cold and it was all I could do to drag myself to class.
Although this seems like a rather minor application of data display (and certainly an inelegant one), it has provided me with some useful feedback. I have found myself getting slightly behind in March, for example, primarily because advising takes up much of my time in the period before fall registration (as you can see on the white board). Knowing that allows me to decide how to tackle the problem before it gets out of hand, and in time to give students realistic midterm grades. Also, I get stickers.
Friday, March 1, 2013
I used the results in my Structure of Higher Education class. Students explored the data on their own, and we looked at it in class. In general, students found the graphic intuitive to use, although there were some technical issues: Some students reported that the data didn't load, or loaded very slowly. I found that it wouldn't load at all on my work PC, running college-mandated IE8 - but it worked, much to my surprise, on my iPhone. (At least I received the courtesy of an error message telling me which browsers would work.)
The resulting graphic shows some things that tables couldn't easily. For example, some institutions cluster tightly, choosing each other as peers, whereas others do not. That's immediately visible by clicking on any institution, represented by a dot, and seeing whether the other dots that light up are near or far from it.
One minor distraction was that the entire graphic was not visible in one window. I didn't expect this on my iPhone, naturally, but even on a traditional monitor, only part of the data was visible at once, even when zoomed out. Ideally, the zoom level would allow all of it to be seen at once. One potential fix would be to move institutional data from the right to the top or bottom, given the landscape orientation of the graphic. There is actually relatively little of what Tufte calls "chartjunk" taking up valuable screen space - the browser itself uses some space, but that is out of the Chronicle's control.
Overall, it's a relatively sophisticated presentation of data that adds real value beyond what a strictly tabular format could offer.
Friday, January 11, 2013
The question breaks down into two parts. One, how many board members did you have on the board of Your College last year? Two, how many of them donated during that fiscal year?
IPEDS, the ultimate data source for researchers in this field, doesn't have the answer to either one. (What makes IPEDS so great? It's a survey that is conducted every year, so it's up to date, and every institution that gets federal funding must participate. That includes indirect funding, such as federal grants/loans to students, so it means nearly every institution in the country.) Various surveys do ask how many board members a college has, but none of them are conducted every year, nor are any of them comprehensive. Many college websites provide enough trustee data to at least find out how many there are, but not all.
The second question, about how many gave, isn't asked at all. I won't list every data source that can't help us, but I will mention one: The Voluntary Support of Education Survey, one of my personal go-to sources, doesn't tell us, either. It asks for the number of board gifts made in a year, which isn't the same thing: It includes gifts from emeritus and honorary members, who are non-voting. Besides, if a single donor gave twice during the year, that would show up as two gifts. We want to count donors, not donations.
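The distinction between counting gifts and counting donors is easy to blur, so here is a minimal sketch of it in Python. All donor IDs, amounts, and board membership below are invented for illustration, not real VSE data:

```python
# Hypothetical illustration: a gift count overstates donor participation.
# Each record is (donor_id, amount); donor "T3" gave twice, and "E1" is an
# emeritus (non-voting) member whose gift shouldn't count toward board giving.
gifts = [
    ("T1", 500), ("T2", 1000), ("T3", 250), ("T3", 250),
    ("E1", 100),
]
voting_board = {"T1", "T2", "T3", "T4", "T5"}  # T4 and T5 gave nothing

num_gifts = len(gifts)                               # what a gift count reports: 5
donors = {d for d, _ in gifts if d in voting_board}  # unique voting-member donors
participation = len(donors) / len(voting_board)      # 3 of 5 members gave

print(num_gifts, len(donors), f"{participation:.0%}")  # → 5 3 60%
```

A survey that reports five gifts leaves you unable to say whether board participation was 100% or 60%; only deduplicating by donor, restricted to voting members, answers the question.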
But say you don't care about board giving - I use this example as an illustration. There are lots of data problems like this in education research. No one with serious funding has decided they are important enough to merit major resources, but they matter enough to practitioners that anecdotal evidence abounds in white papers and consultants' reports.
Sunday, January 6, 2013
I teach a course at CUNY Baruch on the structure of U.S. higher education, and a large part of what I try to do in that class is to encourage students to be able to segment higher education in various ways: How are research universities different from small liberal arts colleges? How are public institutions different from private, not-for-profit institutions?
One of the tools I use to help them do this is a Google Fusion map:
In the screen shot above, all institutions in the continental states and D.C. are shown (with the exception of Hillsdale College). Red dots are two-year institutions; blue dots are four-year institutions. Even in this view we can notice at least one pattern: Heavily populated areas have more institutions than sparsely populated ones.
By clicking on "filter," users can view a subset of institutions, such as HBCUs, colleges with enrollment of 1,000-1,500 full-time students, or private colleges in California.
The map was created in Google Fusion using IPEDS data. I chose Google Fusion because I have no experience with GIS and it is free. However, it has some serious drawbacks that make it much less user-friendly than I would like; ultimately, this means my students don't take full advantage of it.
Fusion is often non-intuitive (for example, figuring out how to upload a CSV file and convince Google to treat it as a Fusion table rather than placing it in your Drive) and has surprising limitations (a column can't be deleted if the table is the result of merged tables). Still, those quirks are less problematic than the difficulties it presents to end users.
One challenge is that what users see is exactly what the table editors see. I can't hide the spreadsheet from them, for example. Students have a fair amount of difficulty learning to operate the filters; there are too many irrelevant menus (e.g., File, Edit, Tools). They don't need to know how the sausage is made - they just need a sausage and a fork.
The larger problem is that the presentation of data is surprisingly limited. My biggest wish is for students to be able to show different types of institutions in different colors - for example, to display private and public institutions in contrasting colors, and then turn that option off when examining something else. However, Fusion doesn't allow icons to change on the fly, so I settled on two-year versus four-year as the default. That makes it much harder to see the proportion of privates versus publics: one has to toggle back and forth between views or (worse) mouse over every individual institution.
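For contrast, a general-purpose plotting library handles color-by-category without difficulty. A minimal sketch of that view using matplotlib; the coordinates and control types below are invented, not real institutions:

```python
# Sketch: color institutions by control (public vs. private), the view
# Fusion wouldn't let me toggle. All points below are invented.
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

points = [
    # (longitude, latitude, control)
    (-118.2, 34.1, "private"),
    (-122.4, 37.8, "public"),
    (-73.9, 40.7, "private"),
    (-84.4, 33.7, "public"),
]
colors = {"private": "tab:blue", "public": "tab:red"}

fig, ax = plt.subplots()
for control, color in colors.items():
    xs = [lon for lon, lat, c in points if c == control]
    ys = [lat for lon, lat, c in points if c == control]
    ax.scatter(xs, ys, color=color, label=control)
ax.legend()
fig.savefig("institutions.png")
```

Swapping the grouping key (two-year/four-year, HBCU status, sector) is a one-line change here, which is exactly the flexibility Fusion's fixed icons lack.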
Ultimately, the map is not entirely successful as a pedagogical tool. For a savvy user, it could be useful for creating graphics. Fusion seems to be a case study in forgetting who the most important audience for a software product is - not the developers, but the users of what the developers produce.