My family are very clever... - Sally's Journal
My cousin chrisiddon
* has written a public information piece / mini-rant on the cholesterol drug that's just been in the media
You can find Chris' review here, and I think it's rather interesting.
Not only in the "this is the other stuff they don't tell you about their wonder-drug" kind of way, but because it's exactly the behaviour of the media this bad science article
was complaining about. Rather depressing...
*My family is very clever, and some of them manage not to get thrown off their PhDs and end up with Qualifications and Know Stuff. There is a bad tendency for us then to end up as builders' assistants and Oxfam shop clerks, but we're working on it :-)
We do a lot of research on statins here.
Date: March 15th, 2006 10:10 am (UTC)
Err, that would be because you're at Sheffield. Where Chris did his PhD ;-)
Oh, cool. We probably know a lot of the same people. :-)
Very interesting rant!
My Dad is on Simvastatin to reduce his cholesterol; AFAIK his was caused by lifestyle (poor diet early in life methinks) and not by anything inherited. Well, I hope not, otherwise I'll have it! His cholesterol is now down at 3, whereas mine is at 4.5 or thereabouts. I think I'll just stick to a healthy diet and yoga and hope that works.
Oh, and the bad science article was good, but bashed humanities graduates a bit. Bah, I know some o' that scientomalogical stuff too! :P
Date: March 15th, 2006 10:22 am (UTC)
I've done some work on statins. My main response to the news report was to sigh that yes, we knew most of that already...
Date: March 15th, 2006 10:41 am (UTC)
I'm reminded of a model of how (science) (mis)communication works, which someone told me once:
1) Person A presents (carefully constructed) argument
2) Person B:
a) Digests argument into a pool of ideas and factoids
b) Selects ideas and factoids that he (thinks he) understands, and forgets about the rest
c) Adds assumptions and random background knowledge
d) Reconstructs (semi-/in-)coherent picture based on this pool of selected ideas/facts/assumptions
The media's behavioural problem is laziness. It's easier for a journalist to dumb a story down than to try to understand what is going on.
Never, never trust it on pharmaceuticals. The pharmaceutical industry knows that most of the media is lazy and tends to play to this by putting out press releases for journalists to copy straight (and then putting further pressure in through the advertising department).
(Note: this behaviour is not unique to the pharmaceutical industry and media reporting of it.)
Date: March 15th, 2006 01:18 pm (UTC)
To expand on "laziness":
There are three ways in which a science article reaches the press; or possibly two ways, with a slight variation in one of them.
I) A university or company R&D department puts out a press release which is copied near-verbatim into a newspaper, possibly with noncommittal "reaction" comments from other scientists in the field who (quite rightly) aren't going to say anything of substance until they've seen the results published. These articles are almost invariably bad.
I.5) Alarming/exciting early results from a large study are published in an "advance communication", which is picked up by a journalist who doesn't understand the nature of such publications and promulgated as "A recent study says..."
II) The science editor, editor, or features editor reads an article in New Scientist and assigns somebody to plagiarise it creatively. This leads us into the debate as to the merits of Reed Elsevier's publication. NS is relentlessly optimistic; that's an intentional editorial policy and there's nothing wrong with that, but when newspapers pick the same stories up and run with them, they generally fail to mention that whilst the foundational science is largely sound, several layers of speculation have been added.
Date: March 15th, 2006 01:24 pm (UTC)
we just like learning stuff, but don't have that vocational drive.
Thanks for the link, the number of hits on my site has rocketed today!
Date: March 15th, 2006 01:37 pm (UTC)
Re: that's cos
Hey, at least you have jobs, instead of a mad portfolio of part time work...
(We would have vocational drive, but Dr Beckett stole it all, is my theory ;-) )
Enjoy the fame :-)
I was about to go "Ha, economics gets badly reported as well." Then I looked at the BBC news website and realised that they'd done quite well. I suppose this would be because economics stories tend to be reported by journalists with Economics degrees.
Social science stories tend to go a bit weird. They suffer from the same "Here is a random statistic from a very small sample of people. I will multiply it by a million to make it representative of the UK population. I am sure the fact that the sample was made up entirely of female pygmies from Hampstead is not important." mentality. On one of my training days we ripped to shreds an article from the Times on a YouGov poll.
All the same, I think it is important to get non-specialists to have a look at stories to check how comprehensible they are to someone who stopped doing science or maths at the age of 16. If I have to write something at work for non-economists, I take a draft home to my medievalist housemate, who makes helpful comments like "Why are you assuming that the probability of x happening is not more than 1?"
I think the best system is specialist writes something. Non-specialist makes it comprehensible to non-specialists. Specialist checks what non-specialist has written to make sure it bears a passing resemblance to what he originally said. This is kinda what we do when specialists give information to generalists (arts graduates) to put into ministerial briefings.
I think the best system is specialist writes something. Non-specialist makes it comprehensible to non-specialists.
I hear this a lot, particularly without your third step ("Specialist checks"), and often in regard to software documentation/project management. It tends to irritate me.
Why is it automatically assumed that a non-specialist is somehow more capable of communicating the relevant ideas than an expert? Yes, it's useful to have a layman as a proof reader and level-check, but it's hardly essential if you can a) write and b) actually understand the jargon you'd use in discussions with a peer. (Discussion of the writing/presentation skills of a typical scientist omitted...)
From my point of view, it makes more sense for an expert to explain a concept in simple terms than for a novice to take a complex explanation and attempt to simplify it. If the result is still too complex (or so simple as to be patronising) then it's better for the expert to redraft and pitch it at a different level/audience, unless they're completely incapable of doing so.
Why is it automatically assumed that a non-specialist is somehow more capable of communicating the relevant ideas than an expert?
It isn't. It's assumed that an expert will not necessarily know the limits of a non-specialist's knowledge, and that therefore having a non-specialist providing significant input as to (a) where it is incomprehensible and (b) what he understands it to mean (the purpose of the re-write) is a good thing. But remember that in the model described, it is the specialist that gets ultimate right to approve.
It isn't. It's assumed that an expert will not necessarily know the limits of a non-specialist's knowledge
Which strikes me as a daft assumption to make, given that it's a simple test ("Does my audience know about X?") to apply oneself and simple consequences ("Refer to X", "Don't refer to X", "Briefly explain what they need to know of X" or "Cite X" as applicable to the style and argument). There are many techniques to make this easier - such as creating profiles of imaginary readers - and many people have a problem in actually doing it but, fundamentally, it's the ability to write at an appropriate level which is important, not the amount of specific domain knowledge.
Also, in the model described, the non-specialist did a rewrite ("Specialist checks what non-specialist has written [...]"). While it's good (nay, essential) for the expert to have final approval, exactly how does it make sense for someone who doesn't fully understand the article to rewrite it?
There are plenty of reasons why such things might be necessary - time constraints on the expert, poor or slow lines of communication, for example - but it's a likely source of (possibly systematic) errors to have a non-expert simplify text.
But doesn't Goldacre have a point when he says that no such system is applied by people writing about literature, etc? Isn't it both reasonable and necessary to assume in writing an article on a certain specialist area that the readers of that article will have a basic grounding in that area?
Date: March 16th, 2006 01:37 pm (UTC)
Literary criticism is not a matter of empirical fact, and bad arts writing isn't going to persuade gullible innocents into doing things that are bad for their health.
Date: March 15th, 2006 09:32 pm (UTC)
I think that Ben Goldacre appears to have quite a good grip on why the BBC Health section is shite: here
Date: March 17th, 2006 08:02 pm (UTC)
If I don't see the word hypercholesterolaemia ever again I shall be a very happy bunny. Hrm, and I really should compose a rant on How to Write Abstracts.