Tuesday, January 22, 2008

Split-Personality Institutions

Universities are facing a tough time. Are they for extending knowledge? Or for training people up so that they can get jobs in an "advanced" economy? The former harks back to the original definition of "academic", while the latter juggles with the idea of "practicality". But there are compromises to be made, especially when economic efficiency watches over both. Computer "Science" is no exception, as has been highlighted in recent discussion over the suitability of teaching Java to students.

But I think this tug-of-war goes a lot further than education. The Ideal Government blog has a snippet concerning the indecision of the IPS (Identity and Passport Service) over what the National Identity Register should, in fact, be.

This "split-personality" for such services is interesting. Does it come from an agglomeration of functions? (Evolution = economy of scope in an increasingly genericised world?) How about a re-definition of the role a particular institution now finds itself in? (Research now becomes more practical than academic, as progress becomes more and more important?)

Or are we reaching a "critical mass", a tearing between scales? Is the system more important than the user? Can a user exist without the system? And what, after all this, is the correct course of action?

2 comments:

Richard Veryard said...

In the past, organizations set their own agenda. Universities were run by academics, so they set themselves an academic agenda. There were enough students willing to submit themselves to this agenda for 3-4 years, enough employers willing to take a degree in Ancient Greek or Theoretical Physics as a reasonable token of intelligence, and above all a Government (run by people who had mostly gone through the process themselves) willing to fund the whole thing without asking too many questions.

The employers that really understand software engineering have always known that someone with a good degree in Greek or Physics can learn Java in much less time than it takes a computer science graduate to learn to document code properly.

In any case, any technology you learn at university could be obsolete before you complete your degree, and mostly forgotten by the time you are three months into your first job working on something completely different. The real purpose of a computer science degree is hopelessly muddled, in my view.

Confusion of purpose is linked to the fact that there are lots of stakeholders, some of whom have a very limited or out-of-date or self-interested appreciation of how things actually work.

In the example of national identity, there seems to be an even larger gulf between the people who are taking the decisions and the people (such as Light Blue Touchpaper) who actually seem to understand the implications of these decisions.

To my mind, these split-personality institutions (and programmes) are a manifestation of some pretty deep political problems, with no easy solutions. I certainly don't advocate a return to the days of expert-knows-best, but we seem to have replaced it with a dysfunctional process that still retains a massive democratic deficit.

Scribe said...

Yup. I think the problem arises in the same way a lot of British problems do: the govt tries to "catch up" with other countries on a particular indicator, without realising that that indicator is an outcome, not a cause. So Universities get judged - and funded - according to the RAE (Research Assessment Exercise), making it more important to cater to the RAE than to what the RAE is supposed to measure. Oops.

The transition in Computer Science is a single example of a greater, but similar, problem: boosting the economy. Java is great from a company's perspective - documented, standardized and modular code makes it easier to replace workers with cheaper workers. So getting a job requires learning Java, but the training is so "standard" that simply going to Uni is no longer anything "special" - it's just a place they teach Java. Actually learning specialized knowledge is no good for industry, as it lets an individual charge more money... (The other outcome is that only the insightful actually get specialized jobs - maybe that's a good thing? ;)

Equating economy with progress, as the govt is doing, is doomed to fail because a) you need to spend more money to keep people up with that progress, and b) progress is inherently "specialized", which is bad for cheap labour.

Disclaimer: I did CS at Uni, and learnt Java for a large part of it. As of this year, however, I'm actually finding it much more useful to remember the "S" bits, such as algorithmic design and information access, than the Java side of things. Fortunately we were also taught fairly generic ideas...