Archive for the ‘science’ Category

If I were a cartoonist

July 1, 2011

From 2001... Plus ça change... (PS Click cartoon for slightly better image)

In which Dr Aust wishes he could draw, and muses on the changing appearance of the British “-ologist”.

A recent conversation with one of my twitter readers, postdoctoral researcher, occasional blogger and one-time co-worker dbaptista, chanced upon the topic of cartoons.

Dr Aust has always been fond of cartoons, and I have (another) long back-burnered book idea involving a compilation of scientific ones. Sadly, my favourite modern cartoonist, the inimitably black / bleak John Callahan, passed away last year, but his cartoons are still with us, and many remain all-time classics. Though I can’t find it online, a series he did on ‘The Hill of Evolution’ stands out for me. Perhaps I will post a couple here if I can find them in book form.

Callahan’s special gift was to offend pretty much everybody. He once quipped about what happened when he started cartooning:

“Very shortly I was to be identified as a sexist, racist, ageist, fascist communist – in fact, I’m merely a cartoonist”

As I said on twitter, if I’d actually been able to draw worth a damn, and had been better at thinking up funny lines, I might have fancied being a cartoonist.

Which explains, I guess, both my avatar, and the one cartoon that I have published. Though ‘published’ is probably too grand a word; the magazine that printed it is a membership one for the Physiological Society, and the then editor was a friend of mine. And of course I didn’t get a fee.

But anyway, in response to dbaptista’s request, I dug it out of the archives – or rather, found it online – and have reproduced it above*. To my amazement, and even rather worryingly, it is a full ten years old.

Anyway, a few comments on the cartoon, starting with the top panel:

If you look back at photographs of scientists of the 1920s (a nice example can be found in the group photo here), you will indeed find that tweed suits – usually three-piece ones, with a waistcoat to keep one’s shirt and tie away from the smellier or messier bits of the experiment – were pretty much de rigueur. Though for me personally, this panel was an homage to Woody Allen, whose early movie Sleeper contained the immortal line:

‘Science is… guys in tweed suits cutting up frogs’

The 1970s scientist is more like the sort of person I remember seeing around the Oxford University science areas when I was a teenager, and who was still common in Universities when I was a student in the 80s (though many of them had trimmed their beards somewhat by then). In the Physiology Department where I did my PhD in the mid-80s, most of the 40-ish male academics could be seen in older departmental photographs sporting heroically luxuriant 70s facial fuzz. Sir John Sulston is one notable British scientist who keeps this tradition alive.

The 1990s figure probably resembles my own generation of cell physiology people, though it would only actually have been me during a rather abortive Sabbatical year doing molecular biology in the late 90s. Most of my experimental work, back when I still used to do some, was with large microscopes in small dark rooms. These bolt-holes had the added advantage of being good for dozing, and for hiding from the students, or from the Head of Department when he wanted to sign you up for his latest scheme. One thing that was (and remains) characteristic of science academics, at least in the North of England, is the triumph of modern polar-fleece-type outdoor wear over the traditional woolly jumper; the latter is now only seen on the most old-fashioned among us.

Finally, the 2001 picture is doubtless pretty self-explanatory. Though a question arises:

If this was how we all felt about the amount of bureaucratic bullsh*t we had to put up with a full ten years ago… how big would that pile of papers be now?

—————————————————————————————————————-

*Sorry the image is so poor, but I originally drew the cartoon with the stylus on the drawing programme of the old Psion 5mx palmtop computer (anyone remember those?). Couldn’t find a digital version so I’ve had to cut ‘n’ paste it (with a bit of fiddling) from the online PDF version (see p 22 here).


Of slime and childish curiosity

March 26, 2011

In which Dr Aust ponders slime. And scientific tendencies.


Reproduced from the wonderful xkcd.com, the comic strip that regularly captures the spirit and the reality of science

Last weekend the Aust entourage, including Junior Aust (aged six-and-a-bit, well, nearly-seven-in-a-few-months), visited this event at one of the nearby museums, run by the people from Manchester University’s Life Sciences Faculty.

At the event you could, as it says, “Come on a tour of the human body” and learn “how the heart works and how your lungs help you breathe”, among other things.

Junior Aust was fairly unimpressed by the nice chaps with their two-electrode ECG trace, even when I told her it was one of the things dad gets his students to measure on each other. I think the ECG wasn’t participatory enough for her, as they weren’t allowed to wire up members of the public (a shame, really, but understandable).

I DID manage to persuade her to blow into the spirometer and have her Forced Vital Capacity measured – another of those things you can find me getting students to do in their lab classes. I also measured myself for comparison, though I’d already done my annual Hypochondrial Full-service Multi-parameter Respiratory Function Self-assessment while I was running the student classes earlier this Semester.

She was a bit more impressed with the video of the view of the inside of your airways during a bronchoscopy (not done live, before you ask!), which I was able to tell her was the kind of thing mummy used to do to patients.

But the thing that REALLY made a deep and lasting impression on Junior Aust was the “make your own mucus-alike slime” stand. Kitted out in a disposable plastic pathologist-style apron and dashing purple nitrile gloves, she was helped to concoct some truly disgusting-looking greeny-yellow slime out of acrylic glue, water and food colouring. I reassured her that the yellow colouring was just enough to make it look properly phlegm-like and grungey, and she was given some of her confection (tied up in another nitrile glove, there being no plastic bags left) to take home.

Now, we assumed she would lose interest in the stuff quickly enough, but this turned out not to be the case. For the rest of the day we were repeatedly called into action to stop her turning the slime out over the table, or the chairs, or the floor. Despite our best efforts, small chunks of it made their way onto her and her brother’s clothes, and onto the furniture. Yum.

But then we made a truly catastrophic error.

** Warning – you may find the next bit slightly disgusting. **

In a moment of attempting to out-gross Junior Aust, The Boss (Mrs Dr Aust) remarked “That slime” (which was now semi-congealed) “looks exactly like what was in Junior Two’s nappies when he was ill the other week*”.

Oh dear.

Big mistake.

Big, biiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiig Mistake.

Huge.

For, throughout the week since this conversation, we have been regaled daily (or indeed several times each day), by one or both children, with the useful information, faithfully and exactly repeated, of just exactly what Junior Aust’s slime resembles. Typically combined with a display of THE GLOVE, turned inside out so we can have a good look at the congealed yellow stuff.

Nice.

Note to self:

Take care what information thou doth impart to those under seven.

For verily, thou canst not take it back.

Anyway, we are trying to look on the optimistic side. You certainly have to applaud Junior Aust, and her younger sibling, for their impressive curiosity. Even curiosity into slightly gross stuff.

Which explains why I found the cartoon at the top of the post, from the brilliant xkcd.com, so funny when I saw it earlier this evening.

Now, Mrs Dr Aust and I have sworn an oath, in blood and in at least two languages, that the Aust-Sprogs are to be discouraged at all costs from going into any career related to science, or into medicine.

But there is, I fear, the chance that genes, or conditioning, will out.

Time, I guess, will tell.

 

—————————————————————-

* It was almost certainly a rotavirus infection, BTW. Most unpleasant, and not a week we are keen to remember.

Universities need arts as well as science

December 6, 2010

In which Dr Aust notes that scientists, on the whole, do not think that Universities should only have science in them.

In Universities up and down the UK, University managers are considering the implications of the Government’s funding cuts.

All right, all right… I KNOW we haven’t had the Parliamentary vote on raising tuition fees yet (coming this Thursday). And I also know that, in Dr Aust’s University and in many other comparable ones, the senior brains trust is perhaps hoping that they will get to charge the students a much-increased fee which will replace the lost direct funding. I know that.

But, as many people have already noted, the cuts in the direct funding are already written into the Treasury’s spreadsheets.

And most Universities are planning for significant real-terms cuts in the budget, whatever happens on Thursday.

Anyway… where was I?

Oh yes.

In Universities up and down the UK, University managers are considering the implications of the Government’s funding cuts.

In particular, the near-total cut of direct teaching funding for arts and many languages has people predicting that Universities will cut whole departments. The Arts Faculties are definitely nervous – and who can blame them?

Let me give you an example: I heard of one University where the science faculty declined to even circulate an announcement about the “Science is Vital” campaign – the reason widely believed to be that the bosses didn’t want to send the University’s Arts Faculty a signal that scientists thought only science was important. Not that scientists DO think that – they don’t, on the whole – but the arts and humanities people are generally thought to be so twitchy that a “wrong signal” might spread mass panic. The “goodbye arts” idea is certainly widely prevalent among academics gossiping in places like the Times Higher Education comments threads.

Interestingly, the same pressures seem to be abroad in that bastion of the free market in University education, the USA. Conservative governments in the UK have never made any secret of their admiration for the US free market model in all things, and that definitely includes higher education. The fact that some US Universities are shutting arts programmes is thus hardly likely to bolster the confidence of arts academics in the UK.

However, there is at least one eloquent defence of arts programmes doing the rounds, spread from email inbox to twitter to email these last few weeks.

What is interesting about this one is that it comes from a scientist – the eminent enzymologist Greg Petsko, who works at Brandeis University in Massachusetts.

Petsko’s article is entitled “A Faustian Bargain”. In it he eviscerates, in a piece of sustained and forensic mockery, the President of the State University of New York at Albany (SUNY Albany), who announced the closure of several arts programmes and departments. Petsko makes many telling points, among them that a broad education, including the arts, is actually useful to scientists. He also manages to skewer the tendency of all too many University leaderships to manage by fait accompli. Here is a sample:

“You did call a [University] “town meeting”, but it was to discuss your plan [for Department closures], not let the university craft its own. And you called that meeting for Friday afternoon on October 1st, when few of your students or faculty would be around to attend…

It seems to me that the way you went about [this] couldn’t have been more likely to alienate just about everybody on campus. In your position, I would have done everything possible to avoid that. I wouldn’t want to end up in the 9th Bolgia (ditch of stone) of the 8th Circle of the Inferno, where the great 14th century Italian poet Dante Alighieri put the sowers of discord. There, as they struggle in that pit for all eternity, a demon continually hacks their limbs apart, just as in life they divided others.

The Inferno is the first book of Dante’s Divine Comedy, one of the great works of the human imagination. There’s so much to learn from it about human weakness and folly. The faculty in your Italian department would be delighted to introduce you to its many wonders – if only you had an Italian department, which now, of course, you don’t.”

Petsko repeatedly uses the final motif –

“- if only you had an XYZ department, which now, of course, you don’t.”

– to skewer the Albany President mercilessly. He then goes on, near the end of the piece, to say the following – which should ring a loud bell with anyone who has been following the proposed changes to teaching funding in UK Universities:

“As for the argument that the humanities don’t pay their own way, well, I guess that’s true, but it seems to me that there’s a fallacy in assuming that a university should be run like a business. I’m not saying it shouldn’t be managed prudently, but the notion that every part of it needs to be self-supporting is simply at variance with what a university is all about. You seem to value entrepreneurial programs and practical subjects that might generate intellectual property more than you do ‘old-fashioned’ courses of study. But universities aren’t just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment. There is good reason for it: what seems to be archaic today can become vital in the future.”

Petsko then gives two examples, one from science and one from arts and humanities. They are virology, which was in decline in the 1970s until HIV suddenly threw the shortage of virologists into sharp relief and gave the subject a new urgency; and middle eastern languages and culture, which were sparsely taught until the events of September 11th 2001 and their aftermath.

He continues:

“I know one of your arguments is that not every place should try to do everything. Let other institutions have great programs in classics or theater arts, you say; we will focus on preparing students for jobs in the real world. Well, I hope I’ve just shown you that the real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today’s backwater is often tomorrow’s hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren’t too narrowly trained. If none of that convinces you, then I’m willing to let you turn your institution into a place that focuses on the practical, but only if you stop calling it a university and yourself the President of one. You see, the word ‘university’ derives from the Latin ‘universitas’, meaning ‘the whole’. You can’t be a university without having a thriving humanities program. You will need to call SUNY Albany a trade school, or perhaps a vocational college, but not a university. Not anymore.”

———————————————-

Are liberal arts degrees a solution to the UK funding problem?

Petsko makes various references to the liberal arts educational model – common in US undergraduate degrees – where students take a broad spectrum of courses. This is something my friend Prof David Colquhoun has been writing about recently as a possible part solution here in the UK.

In Petsko’s view, these set-ups provide a way for more “profit-making” subjects (or, in the UK context, ones the Govt is still going to provide some teaching funding for) to subsidise subjects which make less money. For instance, if students taking science as their “major” subject were also required to take courses like composition and/or rhetoric, then you could have people in, say, Classics departments whose teaching duties might primarily consist of teaching rhetoric to non-Classics students. There is even a sort of parallel here with “service” teaching in science departments. This is a system, common historically in UK Universities with medical schools, where some people in the science departments mainly teach subjects like physiology or pharmacology to medical and other health science students.

As the cuts have loomed larger, there have been many eloquent defences both of the intrinsic value of the arts and of the economic usefulness of subjects other than the hard sciences. For instance, Kieron Flanagan recently pointed me to this defence of humanities and social sciences. And there is Stefan Collini’s truly magisterial deconstruction of the Browne Review, on which the Govt’s proposed changes are based, in the London Review of Books here.

However, let’s stick to science and University science teaching – on the basis that one should concentrate on talking about things one knows something about. The central point that I would make, along with Petsko, is that studying science – or, at least, studying for a modern science BSc degree – does not teach you everything that scientists need to know. You might, indeed, get some of the other stuff from things like the arts. Or from literature. Or from reading newspapers. Or from writing, and communicating, about science to non-scientists.

And again: as a scientist, I find the argument that a scientific training and education is useful entirely, or even primarily, because it is “vocational” quite flawed. It is a commonplace among my scientific colleagues and me that the primary value of our degree is NOT entirely, or even particularly, “vocational”, i.e. in training more scientists. The value lies in training critical thinkers who also happen to be scientists. But training critical thinkers is something that all academic disciplines hopefully do – indeed, I would see it as a key purpose of all Higher Education. I am quite certain the arts and humanities pride themselves on instilling critical thinking, as well as producing “lifelong learners”, and all the other buzzwords.

Finally, there is the question of how the culture of Universities will change, if the arts are hit hard. This was, of course, where we started with Prof Petsko’s satirical tour-de-force. But I will leave the last word to an eminent British scientist and Professor I know, writing in the pages of the Times Higher Education a few months back. His short letter does not have Petsko’s rhetoric, or sustained scorn and humour, but it serves equally to make the point that scientists do not generally think that Universities should only do science:

“…..As with every time new [higher education] “world rankings” are published, I find myself scratching my head.

Am I missing something? Card-carrying professional scientist that I am, it still completely eludes me how institutions such as the Massachusetts Institute of Technology or the California Institute of Technology (or even our own Imperial College London [1]), which, as far as I know, have absolutely no arts faculties of any size, shape or form, can possibly be considered superior “universities” to the likes of Oxford, Cambridge, the University of California, Berkeley or Yale.

Did someone change what a “university” is while I wasn’t looking?”

To which the answer seems to be:

“No, but the UK coalition government seem to be inclined to give it a try.”

I do hope, myself, that they don’t succeed.

————————————————————

[1] Before Alice Bell tells me off, we should say that Imperial Colege haz humannities..it duz: see here.

Note: You can find a list of all Petsko’s columns, written for the journal Genome Biology, here – and a link to download a kind of eBook compendium of them (if you are an iPhone/iPod type) is here.

More Jenkins Junk

July 7, 2010

In which Dr Aust ponders whether Simon Jenkins makes it up as he goes along

The other day I was listening to Simon Jenkins on the weekly Guardian science podcast, discussing his latest predictable tirade against the scientists, and the “Spoofjenks” reaction to it (my own little contribution is below/here).

In the podcast conversation with the Guardian’s Alok Jha (it starts at about 22 min 45 sec in, and goes on until 31:15 or so), Jenkins seemed to be backtracking somewhat on his latest article. As I heard it he was arguing that:

(i) what he had said was all very mild;

(ii) all he really wanted to say was that science could not expect to be shielded from the UK public sector cuts;

(iii) scientists “don’t know how to ask for money properly” (by which I think he means they overstate the importance, and likely benefit, of their work).

Hmm.

I don’t buy it, really.

First, I’m not sure that calling the President of the Royal Society (or things that the latter had said) “shameless” and “two-faced” is all that mild. And Jenkins’ central point (as he tells it) that he was merely commenting on scientists wanting to be protected from the public sector cuts seems rather disingenuous, given the well-worn nature of Jenkins’ theme (“pointy-headed scientific experts and why I distrust them”, if I may shorthand it that way).

Anyway, there were a couple of stand-out moments in the conversation (at least for me).

One was when Alok Jha offered, as an illustration of the way in which science projects can yield unexpected benefits, the example of CERN and the World Wide Web. Computer scientist Tim Berners-Lee was a contractor at CERN, and later a Fellow, when he came up with the ideas that launched the WWW. Now, this is not a justification of CERN, but it certainly is an illustration of what most scientists believe, namely that you can’t really predict with any accuracy where key advances will come from.

As far as I could tell, Jenkins’ response to this was to say that he doesn’t believe it. He did, to be fair, say some slightly less stupid things too. One was to suggest that the internet was an outgrowth of defence research. There is a fair amount of truth in this, as networking computers in remote locations together was certainly driven forward by projects like ARPANET. This later morphed into network projects involving Universities and scientific institutions, like the BITNET system I was using to send e-mail in the late 80s and early 90s.

Jenkins’ other gambit was to say:

“Well, if the CERN people hadn’t done it, someone else would have”.

This latter is, of course, likely true of all human discoveries; but to offer that as a reason not to fund science and scientists strikes me as spectacularly stupid. Someone has to discover things. It hardly seems desperately controversial (not to me, anyway) to say that having some of your brightest people work at being “professional discoverers” is a good way to do it. What hard evidence is available seems to bear this out.

It is also, I think, a good idea to have your professional discoverers work in a system where they talk to one another and disseminate, by publishing, what they have found, so that anyone else can make use of it. Many a human discovery has been made, and then lost or forgotten, and then had to be re-discovered. The scientific literature system now makes this a bit less likely.

What really struck me about this exchange was that Jenkins clearly didn’t know the slightest thing about what Berners-Lee had actually done, or even (slightly more surprising to me) the difference between the earlier computer networks and the WWW. Apart from Berners-Lee’s role being rather well known, anyone – e.g. anyone who writes recurring columns denouncing scientists for being a bunch of smug parasites – can go and read about it on Wikipedia under “History of the Internet”.

Come back Homer Simpson – we need you as a well-paid columnist

However, the real “D’oh!” moment came when Jenkins started talking about what he saw as research that justified its costs, and research that didn’t. He contrasted research into Alzheimer’s Disease (which he said he thought led to real tangible advances in understanding the brain) with research into cancer, which he seemed to regard as a bottomless pit into which money was poured for no results.

Rather odd, I thought.

Because research into Alzheimer’s has, as yet, had little impact on the actual disease. That’s little as in “essentially none”. As Wikipedia puts it:

“There is no cure for Alzheimer’s disease; available treatments offer relatively small symptomatic benefit but remain palliative in nature.”

We do know a vast amount more about the underlying pathophysiology of Alzheimer’s, and the biology of Amyloid Precursor Protein, than we did a dozen years ago. But so far this greater knowledge has yielded no noticeable improvements in therapy. The cholinesterase inhibitor drugs that are on the market to treat Alzheimer’s are widely regarded as pretty useless, or close to it (pace Terry Pratchett), the experimental drug therapies have so far been a disappointment, and a commenter at the Guardian podcast page made the Pharma in-joke that:

“Alzheimer’s research has been such a failure that Pharma companies are actually cooperating with each other to try to make progress”

Now, personally I wouldn’t call the research a failure, exactly; as I said, we now know much more, but treatment breakthroughs have not been forthcoming. The same is true for Cystic Fibrosis, for instance. It is also true of Huntington’s Disease and many other neurological and neurodegenerative conditions.

Anyway, the sad reality is that we can do little currently in terms of preventing or slowing Alzheimer’s, and we certainly cannot cure it. The best advice seems to be to take a daily brisk walk, watch your blood pressure and lipid profile, and play chess or Sudoku to keep your brain active.

In contrast, if we talk about cancer, Jenkins’ other example, things are a bit different. You could have a look, for instance, at this.

Or this.

Now, this improvement in cancer survival rates may not be mostly the result of what you think of as lab-based basic research. It may be more about painstaking clinical research to optimize drug regimes, or to refine procedures for surgery or bone-marrow transplant. It may be to do with better diagnostic techniques, like MRI and CT scanning, that allow earlier diagnosis and treatment and hence better outcomes. But of course, all those processes stand on the shoulders of basic research at some stage – and typically at multiple stages.

More importantly for the current discussion, the statistics – the real numbers – show that there are incremental year-on-year improvements in cancer survival. And if you cast the timescale back further, the gains are even more apparent, as you can see for childhood cancer here, from where I took this Figure:

UK survival stats for childhood cancer

Five-year survival statistics for children diagnosed with the indicated cancers in the years 1962 to 1996.

So contrast:

Alzheimer’s – no improvement in treatment.

Cancer – year on year improvements. Slow but steady.

But Jenkins likes the first, where he thinks he sees recognisable benefits of research, and not the second.

Now, one can make a more complex argument about “value for the amount of money spent”, of course, and there is certainly a lot more money devoted to cancer research than to dementia research. But still, Jenkins’ comments seem completely… well… uninformed.

[Update: 12th July – Cancer Research UK have just released new figures showing that long term survival after a number of hitherto very deadly cancers has doubled since the 70s. Of course there are still some cancers where survival has not improved, like pancreatic cancer. But I doubt anyone would believe that less research is going to make that better. The figures are here.]

So – what would you conclude from Jenkins’ use of these examples?

Well, I am left concluding that what he thinks matters is whether scientists talk in terms that “speak”, personally, to Simon Jenkins. Or, one could say, terms which speak to his prejudices.

And I am also left concluding that he pretty much makes this stuff up as he goes along.

(Not his basic boilerplate article railing against science, of course – he seems to recycle that one with gusto. Indeed, if I were really going to spoof him with conviction I guess I should reprint my piece from last week every six weeks or so, with only minor cosmetic alterations).

Be afraid – be very afraid…

Now this “making it up as you go along” is, of course, a long-standing tradition in Britain, both in certain social circles and in comment journalism. So perhaps we should expect nothing else.

The nagging worry is that there might be some people out there who take Jenkins’ views on science, and scientists, seriously. Just last week I heard one worried senior scientist refer not just to Jenkins’ article, but to Jenkins’ “influential readership”.

The obvious implication was that politicians, top mandarins, the media and other members of the Great and the Good, are where one finds Jenkins’ readers.

And in an era when the Tories have just put an Evangelical Christian with a tendency to invent her own statistics and facts, and a man who believes astrology can help surgeons get better operating results, onto the House of Commons Health Select Committee… …that thought really does worry me.

It worries me quite a bit.