Sunday, January 29, 2006

Bloggers Before There Was Internet

John Wilkes was recently the subject of a biography casting him as the father of civil liberties for his struggles before the American War of Independence. BrooklynDodger(s) haven't gotten to the end of this newly published book. One of the BrooklynDodger(s) personalities now considers Wilkes an avatar.

The point is that Wilkes wrote and caused to be published the North Briton, an unsigned newspaper. As a personal series of political observations, North Briton was essentially a blog distributed on paper. Political society was limited enough then that a limited circulation paper impacted important people.

Wilkes fought a duel with a targeted politician over Wilkes's refusal to admit to writing an article attacking that politician.

[The duel was a draw, and the combatants went out drinking afterwards.]

Nevertheless, the value of anonymous political commentary has deep, deep roots.

Wednesday, January 25, 2006

Confounding of Work Organization and Physical Factors in Ergonomic Risk Assessment


Issue: Volume 49, Number 1 / 15 January 2006

Pages: 12 - 27

Work routinization and implications for ergonomic exposure assessment

Judith E. Gold A1, Jung-Soon Park A2, Laura Punnett A1

A1 Department of Work Environment, 1 University Ave, University of Massachusetts Lowell, Lowell, MA, 01854, USA
A2 Zen Buddhist Temple, 1710 W. Cornelia Ave, Chicago, IL, 60657, USA


Jobs in many modern settings, including manufacturing, service, agriculture and construction, are variable in their content and timing. This prompts the need for exposure assessment methods that do not assume regular work cycles. A scheme is presented for classifying levels of routinization to inform development of an appropriate exposure assessment strategy for a given occupational setting. Five levels of routinization have been defined based on the tasks of which the job is composed: 1) a single scheduled task with a regular work cycle; 2) multiple cyclical tasks; 3) a mix of cyclical and non-cyclical tasks; 4) one non-cyclical task; 5) multiple non-cyclical tasks. This classification, based primarily on job observation, is illustrated through data from a study of automobile manufacturing workers (n = 1200), from which self-assessed exposures to physical and psychosocial stressors were also obtained. In this cohort, decision latitude was greater with higher routinization level (p < …)


[avatar for psychosocial stress]

BrooklynDodger(s) Comments:
BrooklynDodger is giving in to this psychosocial stuff. As manufacturing jobs go away - jobs where there are stressors which can be objectively and quantitatively measured, then abated according to the hierarchy of controls - the Dodger thinks public health scientists have to look at the stuff euphemized as "work organization."

The basic framework for job-related psychosocial stress is the demand-control-support paradigm. The paradigm has a certain construct validity. The key document in work organization investigation is the job content questionnaire. The Dodger stepped up to the venerable technique of self-experimentation and took the questionnaire. Frankly, it sucks, and especially so when applied to hourly factory jobs. The questionnaire was developed before we knew about ergonomics, and before we knew that the assembly line workers were mostly working in pain. The "blue collar blues" had a physical as well as psychological basis. Maybe the psychological issues were caused by the stress of pain.

Back to the Job Content Questionnaire. The Dodger didn't know the answers to the questions for the Dodger's own work setting, which is not a factory job. In auto worker groups, hourly people routinely answer all the questions wrong and then claim high job satisfaction.

The questions probe subjective responses to external conditions, and don't elicit information on those conditions. A next step would be context specific focus groups to derive observational criteria to get to the stressors implicit in the questionnaire. Questions about work related pain would help.

Now to the matter at hand.

The paper documents substantially reduced routinization over a 6-year period.

Multiple non-cyclical tasks sounds like skilled trades work, which is known to lack repetitive work stressors. It's not much of a surprise the ergo reports are lower, if only because walking from one task to the next is time not spent pounding steel.

The last line quoted from the abstract is maybe the most interesting contrarian observation:

"In this cohort, decision latitude was greater with higher routinization level." Wow! What does that mean?

Tuesday, January 24, 2006

More on Flu Kinetics

So maybe BrooklynDodger(s) were too academic about the persistence of infective avian flu material in the environment. Maybe the WHO public health message of "don't worry about chicken feces infected material after 6 days at body temperature or 35 days in a refrigerator" is the most practical expression, and, the Dodger hopes, science based.

While researching that post, the Dodger found this picture on the San Francisco Department of Public Health website. There's no caption telling the Dodger which are the viral particles [dots or rods or both] or what the matrix is. The Dodger hopes it's public domain.

The digressive search led to another set of numbers to muse about:

"Human Symptoms

Individuals exposed and infected by birds with avian H5N1 influenza usually developed symptoms within one week. This time between H5N1 infection and symptoms, or incubation period, is usually 2 to 4 days, but in some cases disease occurred up to 10 days after exposure. Symptoms include high fever, muscle aches, cough, sputum production and shortness of breath. Abdominal pain and diarrhea can also occur."

The Dodger assumes these are canonical numbers propagated by CDC and WHO. "Some cases 10 days after exposure" is at least a real observation, the known longest time, although the Dodger wonders how to figure that out if there are chickens in the yard. Maybe best defined as longest time after first known contact with a sick bird? last known contact?

"Usually 2 to 4 days" is a lot of weasel words; could they have said "half the cases develop within 3 days," with some observation on the shortest time and on the time after which further cases are rare [which we guess to be time after removal of contact with sick birds, but what about infected material?]
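
The Dodger's complaint is really arithmetic: given a line listing of cases, the median and the tail are one sort away. A minimal sketch, with incubation times that are entirely hypothetical (real numbers would come from WHO case listings):

```python
# Hypothetical incubation times in days, for illustration only.
cases = [2, 2, 2, 3, 3, 3, 3, 4, 4, 4, 5, 5, 6, 7, 10]

def percentile(data, p):
    """Nearest-rank percentile: the smallest value with at least
    p percent of observations at or below it."""
    s = sorted(data)
    k = max(0, -(-len(s) * p // 100) - 1)  # ceil(n*p/100) - 1
    return s[int(k)]

print(percentile(cases, 50))  # "half the cases develop within N days"
print(percentile(cases, 95))  # the tail that "up to 10 days" gestures at
```

The median plus a stated tail percentile says more than "usually 2 to 4 days" ever can.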

Monday, January 23, 2006

Flu Kinetics - Single Number Disinformation

The most recent WHO fact sheet contains this alarming but misleading statement [the alarming part isn't the misleading part, it's very alarming]:

"Highly pathogenic viruses can survive for long periods in the environment, especially when temperatures are low. For example, the highly pathogenic H5N1 virus can survive in bird faeces for at least 35 days at low temperature (4°C). At a much higher temperature (37°C), H5N1 viruses have been shown to survive, in faecal samples, for six days."

BrooklynDodger(s) Comments: BrooklynDodger suggests this statement would be a lot more informative, and also true, if stated in terms of half-lives or decay times. It's misleading to say that room temperature chicken turds are dangerous for 6 days, but then become safe at midnight.

The potential for infection might also be better understood if there were estimates of an ID50, infectious dose of virions for 50% of the population, with some notion of the population distribution. Actually, you'd want to know the virions per gram of turd, so as to calculate an ID50 in terms of turd grams.

Those two data sets would yield a probability of infection at a given time. That in turn might distinguish human-to-human transmission from transmission from chicken-free but chicken-contaminated articles and materials. It would also yield a science based approach to flock culling, and a re-entry time for new chickens into a formerly infected coop.
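
The two data sets combine in a few lines. A minimal sketch, in which every number is hypothetical (the virions per gram, the half-life, and the ID50 are all invented for illustration), of how first-order decay plus an ID50 yields an infection risk that declines smoothly with time instead of switching off at midnight:

```python
def surviving_virions(n0, t_days, half_life_days):
    """First-order (exponential) decay of infectious virions."""
    return n0 * 0.5 ** (t_days / half_life_days)

def p_infection(dose, id50):
    """Crude one-hit dose-response anchored at the ID50:
    P = 1 - 0.5**(dose/ID50), so P(ID50) = 0.5 exactly."""
    return 1.0 - 0.5 ** (dose / id50)

# All hypothetical: 1e8 virions per gram of droppings, a 1-day
# half-life at 37 C, and an ID50 of 1e6 virions.
n0_per_gram = 1e8
half_life = 1.0
id50 = 1e6

for day in (0, 3, 6, 10):
    dose = surviving_virions(n0_per_gram, day, half_life)
    print(day, round(p_infection(dose, id50), 4))
```

Under these made-up numbers the turd is near-certain to infect on day 0 and nearly harmless by day 10; the point is the shape of the curve, not the values.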

The Dodger guesses that these infective flu particles are virions, and not virus-infected cells of some type. [For example, legionella live and reproduce inside other microorganisms, and animal macrophages. Anthrax spores live "forever" in dirt. The Dodger guesses that virions have to be wet.]

Theoretically, it's possible to say the infectious material has a true zero risk when the last virion decays. But practically, the Dodger guesses there's a first order decay process with a huge number of virions which never get to zero even at infinity.

[But, what about autoclaving, wouldn't that kill them all? That's a first order process too? The Dodger guesses you could guess the number of virions in the sample, 10 to the whatever power, and figure out how long it would take to get down to 1 and then 0.]

Yes, the Dodger says, there aren't infinite virions in the chicken turd, so there is a true zero. But it's a risk assessment number to be calculated. With a distribution.
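
The bracketed autoclave musing works out the same way: with first-order kinetics, the number of half-lives to take N0 particles down to a single expected survivor is just log2(N0), so even absurdly large starting numbers reach "zero" in a modest number of half-lives. A sketch:

```python
import math

def halflives_to_one(n0):
    """Number of half-lives for a first-order decay process to take
    n0 particles down to one expected survivor: log2(n0)."""
    return math.log2(n0)

# A gram of droppings with a (hypothetical) 1e8 virions needs about
# 27 half-lives to reach one expected survivor; even 1e20 needs ~66.
print(round(halflives_to_one(1e8), 1))
print(round(halflives_to_one(1e20), 1))
```

So the true-zero time is finite and calculable, as the Dodger says, once you pick a starting count and a half-life; the distribution around it comes from the Poisson statistics of the last few virions.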

Sunday, January 22, 2006

Dodger(s) Pluralism - Easy Way to Fill up a Post

In a previous post, the Dodger(s) raised the possibility of multiple personalities, or even a multiplicity of persons, being responsible for the farrago posted to this blog. After all, the Reveres claim to be 3 persons, one of whom might be thought, from a posted reference to a Mrs. Revere, to be a male in a heterosexual relationship. But who is to say?

[Interestingly, "ferrago" gains over a million Google hits, while "farrago," a correct spelling of something, gets only a few hundred thousand.]

Others might wonder whether the multiplicity of avatars, and sometimes the absence of an avatar reflect a multiplicity of bloggers. The Dodger(s) won't [which includes can't] say.

The chad counter conveys scrutiny, but the Dodger fears copyright infringement. The two Francis Bacons are cool, but likely only the old one is public domain.

Saturday, January 21, 2006

Developing World Diesel

Toxicological and Environmental Chemistry

Volume 87, Number 4 / 26 December 2005

Pages: 463 - 479

Polynuclear aromatic compounds in kerosene, diesel and unmodified sunflower oil and in respective engine exhaust particulate emissions

Joseph O. Lalah A1 and Peter N. Kaigwara A2

A1 Department of Chemistry, Maseno University, Maseno, Kenya
A2 Biomass Energy Section, Ministry of Energy, Nairobi, Kenya


Polynuclear aromatic compounds (PAC) were characterized in diesel fuel, kerosene fuel and unmodified sunflower oil as well as in their respective engine exhaust particulates. Diesel fuel was found to contain high amounts of different PAC, up to a total concentration of 14,740 ppm, including carbazole and dibenzothiophene, which are known carcinogens. Kerosene fuel was also found to contain high amounts of different PAC, up to a total concentration of 10,930 ppm, consisting mainly of lower molecular weight (MW) naphthalene and its alkyl derivatives, but no PAC component peaks were detected in the unmodified sunflower oil. Engine exhaust particulates sampled from a modified one-cylinder diesel engine running on diesel, kerosene and unmodified sunflower oil, respectively, were found to contain significantly high concentrations of different PAC, including many of the carcinogenic ones, in the soluble organic fraction (SOF). PAC concentrations detected at the exhaust outlet indicated that most of the PAC that were present in diesel and kerosene fuels before the test runs got completely burnt out during combustion in the engine whereas some new ones were also formed. The difference between the character and composition of PAC present in the fuels and those emitted in the exhaust particulates indicated that exhaust PAC were predominantly combustion generated. High amounts of PAC, up to totals of 52,900 and 4830 µg m−3 of burnt fuel, in diesel and kerosene exhaust particulates, respectively, were detected in the dilution tunnel when the exhaust emissions were mixed with atmospheric air. Significant amounts of PAC were also emitted when the engine was run on unmodified sunflower oil with a total concentration of 17,070 µg m−3 of burnt fuel detected in the dilution tunnel. 
High proportions of the combustion-generated PAC determined when the engine was run on diesel, kerosene and unmodified sunflower, respectively, consisted of nitrogen-containing PAC (PANH) and sulphur-containing PAC (PASH).


BrooklynDodger(s) Comments:

[Reader note: Although the Dodger has anonymized the Dodger's gender, reader(s) may wonder about the Dodger's individualism. Henceforth, the Dodger confesses to not revealing the number of bloggers who make up the Dodger. Certainly the diversity of subjects suggests more than one person, or, in the alternative, an adult ADD victim with too much time on the victim's hands. However, the Dodger(s) don't have enough time to put the (s) at the end of each usage.]

This paper comes from Kenyan colleagues. The problem in figuring out the least dangerous combination of internal combustion engines and fuels is the lack of a common experimental protocol for emissions. The Dodger yearns for a data set comparing natural gas v. gasoline v. diesel from engines with potential mass production control systems.

Anyway, this is apparently a one-cylinder diesel engine, so it needs extrapolation to light- and heavy-duty vehicle diesels.

Diesel and kerosene are similar in MW but with different refining goals and methods. The US is transitioning from high- to low-sulfur diesel; the Dodger might expect the Kenyans to have high-sulfur fuel. The Dodger has at times wondered about the chemical form of the sulfur in diesel; here's an actual compound ID'd. The Dodger has to look up carbazole.

In this contest, kerosene won, diesel fuel lost, and edible oil came out in between. The PAC's came from combustion. The nitrogen for the PAC's comes from thin air, nothing to be done about that. The sulfur in the diesel might be taken out by refining changes. Where is the sulfur coming from for the edible oil? Edible oil would have less heat content, being already partially oxidized. How they combust edible oil, with its much lower vapor pressure than the hydrocarbons, mystifies the Dodger.

[blogger refused to upload bmp's of carbazole or dibenzothiophene, or paste the image. Picture dibenzofuran with a sulfur [for thiophene] or a nitrogen [for carbazole.] These are 6-5-6 ring heterocycles, in contrast to dibenzodioxins, which are 6-6-6 heterocycles with two oxygens.]

Friday, January 20, 2006

Intelligent Design and the Big Bang

Science 6 January 2006:
Vol. 311. no. 5757, pp. 54 - 57


The Distance to the Perseus Spiral Arm in the Milky Way

Y. Xu,1,2,3 M. J. Reid,2 X. W. Zheng,1,2 K. M. Menten4

We have measured the distance to the massive star–forming region W3OH in the Perseus spiral arm of the Milky Way to be 1.95 ± 0.04 kiloparsecs (5.86 × 10^16 km). This distance was determined by triangulation, with Earth's orbit as one segment of a triangle, using the Very Long Baseline Array. This resolves the long-standing problem that there is a discrepancy of a factor of 2 between different techniques used to determine distances. The reason for the discrepancy is that this portion of the Perseus arm has anomalous motions. The orientation of the anomalous motion agrees with spiral density-wave theory, but the magnitude of the motion is somewhat larger than most models predict.

1 Department of Astronomy, Nanjing University, Nanjing 210093, China.
2 Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138, USA.
3 Shanghai Astronomical Observatory, Chinese Academy of Sciences, Shanghai 20030, China.
4 Max-Planck-Institut für Radioastronomie, Auf dem Hügel 69, 53121 Bonn, Germany.

Massive stars and their associated bright regions of ionized hydrogen trace the spiral arms of galaxies. However, for our galaxy, the Milky Way, our view from the interior makes it difficult to determine its spiral structure. In principle, one can construct a simple model of the rotation speed of stars and gas as a function of distance from the center of the Milky Way. Then, if one measures the line-of-sight component of the velocity of a star or interstellar gas, one can determine its distance by matching the observation with the model prediction (that is, a kinematic distance). Knowing the distances to star-forming regions, one can then locate them in three dimensions and construct a "plan view" —a view from above the plane—of the Milky Way. Unfortunately, many problems arise when constructing a plan view of the Milky Way, including (i) difficulties in determining an accurate rotation model (which requires values for the distance and orbital speed of the Sun from the center of the Milky Way), (ii) distance ambiguities in some portions of the Milky Way (where an observed velocity can occur at two distances), and (iii) departures from circular rotation (as might be expected for spiral structure). Progress has been made on the first two problems. For example, many kinematic distance ambiguities can be resolved by interferometric studies of hydrogen absorption at radio frequencies, because distant sources will show a greater velocity range for hydrogen absorption than will near sources (1). However, the third problem, noncircular motions, is fundamentally much harder to address


BrooklynDodger comments: For all the controversy over Intelligent Design, there seems little contest over the origin of the universe, the real creation. Why are there no fundamentalist efforts to substitute 7 day creation for cosmological theories? The Big Bang gives at least a metaphorical role for a Deity [the Big Banger]. However, if the Banger just set a machine in motion, with no further supernatural intervention, there's little point in hoping that going to church on Sunday will get you a better deal on Monday.

The Dodger thought this story was cool because the star map put the world in its place. Our solar system looks to be in a not very dense region of the galaxy, and pretty far out from the center. Stars are still forming, but matter is not still being created.
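
For reader(s) curious about the triangulation: the standard parallax rule is that a source with a parallax of p arcseconds lies at 1/p parsecs, so the quoted 1.95 kiloparsecs implies the VLBA resolved an angle of about half a milliarcsecond. A sketch of the arithmetic (the parallax figure here is back-calculated from the quoted distance, not taken from the paper):

```python
PC_KM = 3.0857e13  # one parsec in kilometers

def distance_pc(parallax_arcsec):
    """Trigonometric parallax: a source whose apparent position shifts
    by p arcseconds (half the swing across Earth's orbit) lies at
    1/p parsecs."""
    return 1.0 / parallax_arcsec

# Back-calculated from the quoted 1.95 kpc:
parallax_mas = 1000.0 / 1950.0
print(round(parallax_mas, 2))  # milliarcseconds

# Round trip back to kilometers:
print(distance_pc(parallax_mas / 1000.0) * PC_KM)
```

Half a milliarcsecond is roughly the angular size of a coin seen from across a continent, which is why this took very long baseline interferometry.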

Thursday, January 19, 2006

Noise, Blood Pressure and Cardiovascular Disease


Recently, a systematic review of the scientific literature was published in the journal maintained by the National Institute for Environmental Health Sciences[1]. The investigators conducted a meta-analysis of 43 epidemiologic studies published between 1970 and 1999 that investigated the relation between noise exposure (both occupational and community) and blood pressure and/or ischemic heart disease (International Classification of Diseases, Ninth Revision, codes 410-414). The investigators studied a wide range of effects, from blood pressure changes to myocardial infarction. With respect to the association between noise exposure and blood pressure, small blood pressure differences were evident. The investigators concluded that their meta-analysis showed a significant association between hypertension and both occupational noise exposure and air traffic noise exposure: "We estimated relative risks per 5 dB(A) noise increase of 1.14 (1.01-1.29) and 1.26 (1.14-1.39), respectively." The investigators concluded that noise exposure can contribute to the prevalence of cardiovascular disease, but that the evidence for a relation between noise exposure and ischemic heart disease is still inconclusive because of the limitations in exposure characterization, adjustment for important confounders, and the occurrence of publication bias. They noted that the literature suggests that noise-induced cardiovascular effects must be seen as a consequence of stress.

BrooklynDodger Comments. “Meta-analysis” is a recent scientific procedure for drawing conclusions from a diverse group of scientific studies. Certainly 43 studies of noise and high blood pressure should be enough to settle the question about whether there is a relationship. These Dutch governmental investigators concluded that noise exposure causes high blood pressure. High blood pressure, unless treated, leads to heart disease and stroke.

Progress on noise control stalled in the 1980’s, when the OSHRC ruled that employers could rely on hearing protection devices in most situations. NIOSH since has concluded that HPD’s provide only about 10 dBA of noise reduction, and that a significant fraction of persons exposed at 80 dBA will suffer hearing loss.[2] Therefore, the practical limit of HPD’s is really 90 dBA. Nevertheless, many people take the attitude that occupational hearing loss is not enough of a material impairment to health that it’s worth preventing.

These new studies provide the start for a quantitative exposure response relationship between noise and severe or life threatening health conditions. They should energize the campaign for reducing noise exposure through engineering controls.
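
Under the log-linear model such meta-analyses usually fit, the per-5-dB(A) relative risks quoted above compound multiplicatively, so the arithmetic for a larger exposure contrast is one line (the 15 dB(A) contrast below is a hypothetical illustration, not a number from the paper):

```python
def rr_at_increase(rr_per_5db, db_increase):
    """If risk is log-linear in noise level (the usual meta-analytic
    assumption), the relative risk compounds per 5 dB(A) step."""
    return rr_per_5db ** (db_increase / 5.0)

# Point estimates quoted in the abstract, per 5 dB(A):
RR_OCC = 1.14  # occupational noise
RR_AIR = 1.26  # air traffic noise

# A hypothetical 15 dB(A) contrast, roughly 80 vs 95 dB(A):
print(round(rr_at_increase(RR_OCC, 15), 2))
print(round(rr_at_increase(RR_AIR, 15), 2))
```

Three 5-dB steps turn the modest-looking 1.14 into about a 1.5-fold risk, which is why the per-increment framing understates the stakes of engineering controls.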

[A little snarky, but if this is a "systematic" review, what would an "unsystematic" review be?]

[1] van Kempen EE, Kruize H, Boshuizen HC, Ameling CB, Staatsen BA, de Hollander AE. The association between noise exposure and blood pressure and ischemic heart disease: a meta-analysis.” Environ Health Perspect. 2002 Mar;110(3):307-17.

[2] NIOSH, CRITERIA FOR A RECOMMENDED STANDARD, Occupational Noise Exposure, Revised Criteria 1998 DHHS (NIOSH) Publication No. 98-126

Wednesday, January 18, 2006

Mechanistic Exploration of Carcinogen Exposure Response Relationships

Risk Analysis, Vol. 25, No. 6, 2005

Heterogeneity of Cancer Risk Due to Stochastic Effects

Wolfgang F. Heidenreich∗

Persons with exactly the same genetic background, behavior, environment, etc. may have differences in cancer risk due to a different number of cells on the way to malignancy. These differences are estimated quantitatively by using the two-stage clonal expansion model. For liver cancer the estimated relative risk for persons without intermediate cells at age 40 is less than 10% when compared to the risk of the total population, while the top 0.1% risk group has a more than 100-fold risk compared to the population. The risk of the 1% percentile in risk is more than 100-fold of the risk of the more than 95% persons without intermediate cells. The number of intermediate (premalignant) cells in the risk groups cannot be calculated from incidence data only because they depend strongly on a nonidentifiable parameter. But under plausible assumptions, less than about 1,000 intermediate cells are present at age 40 even in high-risk persons.

KEYWORDS: Heterogeneity; risk; stochastic cancer model; premalignant cells


Brooklyn Dodger Comments: The Dodger's imaginary friends at Effect Measure posted an interesting series on carcinogen risk assessment, mostly on the extrapolation of risk from laboratory bioassays, largely in rodents, to people. It's a useful, short and plain language attempt at framing the policy debate. The Dodger commends this to readers, and also to the Reveres for moving beyond the flu.

Repeating the Dodger's past posts, the key issues in public health protection from cancer causing chemical exposures are:

Does carcinogenicity in laboratory tests (mostly in mice and rats) predict carcinogenicity in people (hazard identification)?

Does carcinogenicity at high doses (in laboratory or in people) predict increased carcinogenic risk at a lower dose (qualitative exposure response assessment)?

Is the exposure response relationship shallow or steep (shallow meaning less reduction in risk for a given reduction in exposure, a lower exponent for the curve)?

The Dodger repeats that for two chemical agents for which we have the opportunity to observe human health effects over a wide range of exposures, the risk appears at the lower exposures. These are environmental tobacco smoke v. direct smoking, and occupational asbestos exposure v. take-home household and community exposure.

The shape of the exposure response relationship can't be observed directly in the lower dose range, lower meaning 1/25 of the effect level. For most lab studies there are at most two observable effect levels, giving unit risks, and a no-observed-effect level. For people studies, some additional rates may be observed, but the quantitative exposure is far less certain. Models estimate the risk in the unobservable range.

The linearized multistage model, and the one-hit, two-hit, log-probit, Weibull and other models, are based on equations giving mathematical expression to a metaphorical account of cancer initiation. These may diverge by large amounts in the unobservable low-dose range; the linearized multistage model tends to give the highest low-dose estimates because of low-dose linearity.
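
That divergence is easy to exhibit numerically. A sketch with made-up parameters, calibrating a one-hit (low-dose-linear) curve and a Weibull curve to the same 10% excess risk at the observed dose, then comparing them in the unobservable range:

```python
import math

def one_hit(d, q1):
    """One-hit / linearized low-dose form: P = 1 - exp(-q1*d),
    approximately linear (P ~ q1*d) at small doses."""
    return 1.0 - math.exp(-q1 * d)

def weibull(d, q, k):
    """Weibull form: P = 1 - exp(-q*d**k); k > 1 bends the curve
    down and predicts far lower risk at low dose."""
    return 1.0 - math.exp(-q * d ** k)

# Calibrate both models to ~10% excess risk at the observed dose
# d = 1 (arbitrary units); all parameters are illustrative.
q1 = -math.log(0.9)
q, k = -math.log(0.9), 2.0

for d in (1.0, 0.1, 0.01):
    print(d, one_hit(d, q1), weibull(d, q, k))
```

Identical at the observed dose, the two models differ by a factor of about 10 at 1/10 the dose and about 100 at 1/100 of it, which is the whole low-dose-extrapolation argument in three printed lines.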

The two stage clonal expansion model attempts incorporation of parameters for some biological processes - cell replication and cell death. Two stage clonal expansion actually incorporates multiple stages but boils down to initiation and promotion. There can be only one rate determining step of a chemical process.

The issue for risk assessment is the distribution of parameters in a population - the exposure-response relationship is a population characteristic. Having observed carcinogenicity at a [relatively] high dose, is the relationship flat [higher low dose risk] or steep?

Trying to come to the point, this modeler points to a clone of a few hundred transformed cells being the precursor to a full-blown malignant tumor. In a handwaving way, this suggests substantial inter-individual variation in resistance to carcinogens, and thus a shallow exposure response relationship.

Monday, January 16, 2006

Chemical sneaks into Great Lakes

It predated test rules; threat is unclear



A little-studied fire retardant has accumulated in Great Lakes sediment and game fish for decades without detection, according to new research.

The discovery about Dechlorane Plus, which went into production in 1964, surprised federal regulators.

"If this was a brand-new chemical, it would probably never get through" the testing process to allow its commercial use, Linda Birnbaum, a leading U.S. Environmental Protection Agency expert on toxic chemical effects, told the journal Environmental Science & Technology for a Jan. 4 article about the study.

But it is unclear what, if any, threat there is to human health or the environment from Dechlorane Plus, used for more than 40 years as a coating for electrical wires and computer cables. The chemical's only U.S. manufacturer, Occidental Chemical Corp. (OxyChem) says it is safe for people and other animals and has no plans to conduct testing on the product, which is manufactured at a plant in Niagara Falls, N.Y.


BrooklynDodger Comments: Persistent organic pollutants, moderate molecular weight chlorinated hydrocarbon compounds, are still the toughest risk assessment task. DDT and other organochlorine insecticides nearly wiped out raptor birds before Silent Spring caught public attention. Dioxin appeared as the doomsday chemical. Gas chromatography with an electron capture detector is pretty sensitive for OC pesticides. Dioxin analyses require gas chromatography with high resolution mass spec.

The OC pesticides are relatively high dose toxicants in laboratory studies. Dioxin is exquisitely toxic in the lab. But human exposure and body burden is way low compared to the levels where health effects are directly observable in the lab.

Well, back to dechlorane plus. The Dodger winkled a structure out of ChemID plus. Nothing but very high acute lethal doses in the data base. This looks like a material which would be very poorly absorbed in bolus doses. The LD50 results are even less informative because of no attempt to measure the tissue levels associated with lethality.

The bridged ring structure does evoke some pesticide analogues:

Dieldrin. Dieldrin is way more potent in acute toxicity. But as a lower MW compound it's probably more readily absorbed.

The Dodger is running out of commentarian gas here. The Dodger remembers dieldrin to be one of the really nasty OC pesticides, beside which DDT is pretty benign. It shares with dechlorane the hexachlorinated bridged-ring structure. Based on impression, you'd give this one a second look. Absorption after chronic exposure would be the exposure variable worth looking at, something a lot easier than a two-year chronic bioassay.

Friday, January 13, 2006

Silica In Construction

Excessive exposure to silica in the US construction industry.

Ann Occup Hyg. 2003 Mar; 47(2):111-22.

Rappaport, S. M.; Goldberg, M.; Susi, P., and Herrick, R. F.

Abstract: Exposures to respirable dust and silica were investigated among 36 construction sites in the USA. Personal measurements (n = 151) were analyzed from 80 workers in four trades, namely bricklayers, painters (while abrasive blasting), operating engineers and laborers. Painters had the highest exposures (median values for respirable dust and silica: 13.5 and 1.28 mg/m(3), respectively), followed by laborers (2.46 and 0.350 mg/m(3)), bricklayers (2.13 and 3.20 mg/m(3)) and operating engineers (0.720 and 0.075 mg/m(3)). Mixed models were fitted to the log-transformed air levels to estimate the means and within- and between-worker variance components of the distributions in each trade. We refer to the likelihood that a typical worker from a given trade would be exposed, on average, above the occupational exposure limit (OEL) as the probability of overexposure. Given US OELs of 0.05 mg/m(3) for respirable silica and 3 mg/m(3) for respirable dust, we estimated probabilities of overexposure as between 64.5 and 100% for silica and between 8.2 and 89.2% for dust; in no instance could it be inferred with certainty that this probability was <10%.


BrooklynDodger Comments: A forgotten story in occupational health is how NIOSH published a criteria document on silica in 1974 [] recommending an exposure limit of 0.05 mg/M3, based on pulmonary function test decrements. The Dodger remembers that maybe the PFT changes achieved statistical significance at lower cumulative exposures than x-ray changes. That was before silica was known to be a human carcinogen, and before we knew that an acceptable exposure level should be 1/10 of the NOAEL. NIOSH also recommended the boiler plate exposure monitoring and medical surveillance.

The Mort Corn OSHA issued an ANPR for silica [and lead]. Lead went forward under Eula Bingham, along with cotton dust, arsenic, vinyl chloride... Silica went nowhere, although a lot of foundries and other facilities were cited for overexposures at 100 ug/M3 [respirable]. Now, a proposal for silica awaits release from OMB.

Now the guru of statistical air sampling, Steve Rappaport, publishes data showing rampant overexposures in the construction trades. These results are substantially higher than the dust exposures measured during the World Trade Center recovery operation.

Steve also quotes the ACGIH values of 3 mg/M3 for non-crystalline silica dust and 0.05 mg/M3 for silica, in contrast to the OSHA PELs of 5 mg/M3 and 100 ug/M3.
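
Rappaport's "probability of overexposure" is, in the lognormal framing the paper uses, just a tail probability on the distribution of worker mean exposures. A sketch of that calculation; the geometric standard deviation of 2.5 is assumed for illustration (the paper estimates its own variance components from the mixed models):

```python
import math

def p_overexposure(gm, gsd, oel):
    """Fraction of workers whose long-run mean exposure exceeds the
    OEL, assuming worker means are lognormal with geometric mean gm
    and geometric standard deviation gsd.  Uses the normal CDF via
    math.erf: Phi(z) = 0.5*(1 + erf(z/sqrt(2)))."""
    z = math.log(oel / gm) / math.log(gsd)
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Bricklayers' median silica of 3.20 mg/m3 against the 0.05 mg/m3
# OEL, with an assumed between-worker GSD of 2.5:
print(round(p_overexposure(3.20, 2.5, 0.05), 3))
```

With the group median 64 times the limit, the tail probability is indistinguishable from 1.0, which is how the paper can report 100% overexposure for some trades.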

Wednesday, January 11, 2006

Shooting Children

From 1993 through 2000, an estimated 22 661 children 14 years old or younger with nonfatal FA injuries were treated in US hospital EDs. Assaults accounted for 41.5% of nonfatal FA injuries, and unintentional injuries accounted for 43.1%. Approximately 4 of 5 children who sustained a nonfatal, unintentional FA injury were reportedly shot by themselves or by a friend, a relative, or another person known to them. 1 of every 5 children who were wounded by a firearm gunshot died from that injury.
PEDIATRICS Vol. 113 No. 6 June 2004, pp. 1686-1692
Nonfatal and Fatal Firearm-Related Injuries Among Children Aged 14 Years and Younger: United States, 1993–2000
Gabriel B. Eber, MPH*, Joseph L. Annest, PhD*, James A. Mercy, PhD(image placeholder) and George W. Ryan, PhD*
* Office of Statistics and Programming, National Center for Injury Prevention and Control, Centers for Disease Control and Prevention, Atlanta, Georgia(image placeholder)Division of Violence Prevention, National Center for Injury Prevention and Control, Centers for Disease Control and Prevention, Atlanta, Georgia
Objective. To provide national estimates of fatal and nonfatal firearm-related (FA) injuries among children ≤14 years old and to examine the circumstances under which these injuries occurred.
Methods. For nonfatal FA injuries among children, we analyzed data on emergency department (ED) visits from the National Electronic Injury Surveillance System for 1993 through 2000. National estimates of injured children who were treated in hospital EDs were examined by selected characteristics, such as age, gender, race/ethnicity of the patient, primary body part affected, intent of the injury, the relationship of the shooter to the patient, where the injury occurred, and activity at the time of injury. For fatal FA injuries among children, we analyzed mortality data from the National Vital Statistics System for 1993 through 2000. Data from both sources were used to calculate case-fatality rates.

Results. From 1993 through 2000, an estimated 22 661 (95% confidence interval [CI]: 16 668–28 654) or 4.9 per 100 000 (95% CI: 3.6–6.2) children ≤14 years old with nonfatal FA injuries were treated in US hospital EDs. Assaults accounted for 41.5% of nonfatal FA injuries, and unintentional injuries accounted for 43.1%. Approximately 4 of 5 children who sustained a nonfatal, unintentional FA injury were reportedly shot by themselves or by a friend, a relative, or another person known to them. During this period, 5542, or 1.20 per 100 000 (95% CI: 1.17, 1.23), children ≤14 years old died from FA injuries; 1 of every 5 children who were wounded by a firearm gunshot died from that injury. Most FA deaths were violence related, with homicides and suicides constituting 54.7% and 21.9% of these deaths, respectively. For individuals ≤14 years old, the burden of morbidity and mortality associated with FA injuries falls disproportionately on boys, blacks, and children 10 to 14 years old. Both fatal and nonfatal injury rates declined >50% during the study period.
Conclusions. Although rates of nonfatal and fatal FA injuries declined during the period of study, FA injuries remain an important public health concern for children. Well-designed evaluation studies are needed to examine the effectiveness of potential interventions aimed at reducing FA injuries among children.
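The "1 of every 5" case-fatality figure can be checked from the counts in the abstract. A quick sketch in Python, assuming case fatality is deaths divided by deaths plus nonfatal ED-treated injuries (the abstract says both data sources were used to calculate case-fatality rates, so this is an illustrative approximation):

```python
# Counts from the abstract, 1993-2000, children <=14 years old
nonfatal_ed = 22_661   # nonfatal firearm injuries treated in EDs
deaths = 5_542         # fatal firearm injuries

# Case-fatality rate, assuming total injuries = deaths + nonfatal ED visits
cfr = deaths / (deaths + nonfatal_ed)
print(f"case-fatality rate: {cfr:.1%}")  # roughly 1 in 5, as the abstract states
```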

Tuesday, January 10, 2006

Exhaled NO - A New Tool for Measuring Particle Effects?

Environmental Health Perspectives Volume 113, Number 12, December 2005
Exhaled Nitric Oxide in Children with Asthma and Short-Term PM2.5 Exposure in Seattle

Therese F. Mar,1 Karen Jansen,1 Kristen Shepherd,2 Thomas Lumley,2 Timothy V. Larson,3 and Jane Q. Koenig1

1Department of Environmental Health and Occupational Sciences, 2Department of Biostatistics, and 3Department of Civil and Environmental Engineering, University of Washington, Seattle, Washington, USA

The objective of this study was to evaluate associations between short-term (hourly) exposures to particulate matter with aerodynamic diameters ≤2.5 µm (PM2.5) and the fractional concentration of nitric oxide in exhaled breath (FeNO) in children with asthma participating in an intensive panel study in Seattle, Washington. The exposure data were collected with tapered element oscillation microbalance (TEOM) PM2.5 monitors operated by the local air agency at three sites in the Seattle area. FeNO is a marker of airway inflammation and is elevated in individuals with asthma. Previously, we reported that offline measurements of FeNO are associated with 24-hr average PM2.5 in a panel of 19 children with asthma in Seattle. In the present study using the same children, we used a polynomial distributed lag model to assess the association between hourly lags in PM2.5 exposure and FeNO levels. Our model controlled for age, ambient NO levels, temperature, relative humidity, and modification by use of inhaled corticosteroids. We found that FeNO was associated with hourly averages of PM2.5 up to 10-12 hr after exposure. The sum of the coefficients for the lag times associated with PM2.5 in the distributed lag model was 7.0 ppb FeNO. The single-lag-model FeNO effect was 6.9 [95% confidence interval (CI), 3.4 to 10.6 ppb] for a 1-hr lag, 6.3 (95% CI, 2.6 to 9.9 ppb) for a 4-hr lag, and 0.5 (95% CI, -1.1 to 2.1 ppb) for an 8-hr lag. These data provide new information concerning the lag structure between PM2.5 exposure and a respiratory health outcome in children with asthma. Key words: airway inflammation, asthma, children, exhaled nitric oxide, particulate matter less than or equal to 2.5 µm, short-term exposure. Environ Health Perspect 113: 1791-1794 (2005). doi:10.1289/ehp.7883 [Online 8 August 2005]
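The polynomial distributed lag idea — constraining the hourly lag coefficients to lie on a low-order polynomial in the lag, so 13 lag terms collapse to a handful of estimated parameters — can be sketched on synthetic data. Everything here (the series, the quadratic lag shape, the numbers) is an illustrative assumption, not the paper's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, max_lag, degree = 500, 12, 2

# Hypothetical hourly PM2.5 series (arbitrary units) and a true lag
# profile decaying to zero by 12 hours; chosen quadratic so the
# constrained model is exactly specified.
pm = rng.gamma(shape=2.0, scale=5.0, size=n + max_lag)
lags = np.arange(max_lag + 1)
true_beta = (1.0 - lags / max_lag) ** 2          # "ppb FeNO" per unit PM2.5

# Lag matrix: X[t, l] = PM2.5 at hour t - l
X = np.column_stack([pm[max_lag - l : max_lag - l + n] for l in range(max_lag + 1)])
feno = X @ true_beta + rng.normal(0.0, 1.0, size=n)

# Polynomial constraint beta_l = sum_k a_k * l**k collapses the
# regression to only degree+1 transformed predictors Z = X @ L
L = lags[:, None] ** np.arange(degree + 1)[None, :]
a, *_ = np.linalg.lstsq(X @ L, feno, rcond=None)
beta_hat = L @ a

print("estimated sum of lag coefficients:", round(beta_hat.sum(), 2))
```

The sum of the recovered lag coefficients is the analog of the paper's reported sum across lag times; here it simply demonstrates how the polynomial constraint reduces 13 lag coefficients to 3 fitted parameters.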


BrooklynDodger Comments: This paper displays a new method for observing airway inflammation in more or less real time. The application of exhaled NO to measuring the effects of fine particles or other airway irritants on workers in the occupational environment should be exploited.

Previously, the most feasible method of measuring "acute" effects was pre- and post-shift pulmonary function tests, mostly looking for changes in FEV1.0. That approach revealed the effects of cotton dust, tire curing fume, and metalworking fluids. But research-grade PFTs are difficult to collect, pre- and post-shift testing is difficult to arrange, and exposure on the day of testing gets to be a challenge to measure. Blowing up a balloon seems easier to get done.

The 24-hour exposures associated with significant increases in exhaled NO were all well within compliance with EPA's NAAQS for PM2.5.

EHP offers the full-text article for download.

Monday, January 09, 2006

Replication vs. Precautionary Principle

Superfluous Medical Studies Called Into Question

By David Brown
Washington Post Staff Writer
Monday, January 2, 2006; Page A06

In medical research, nobody is convinced by a single experiment.

A finding has to be reproducible to be believable. Only if different scientists in different places do the same study and get the same outcomes can physicians have confidence the finding is actually true. Only then is it ready to be put into clinical practice.

Nevertheless, one of medicine's most overlooked problems is the fact that some questions keep being asked over and over. Repeated tests of the same diagnostic study or treatment are a waste -- of time and money, and of volunteers' trust and self-sacrifice. Unnecessary clinical trials may also cost lives.

All this is leading some experts to ask a new question: "What part of 'yes' don't doctors understand?"



BrooklynDodger Comments: Let's apply this principle to public health and carcinogen classification.

The current IARC rules require two separate bioassays [in the absence of other evidence] to call a chemical 2B, "possibly carcinogenic to humans." For example, diethanolamine caused increased liver tumors in male and female mice when applied to the skin, but no tumors in rats. Yet, because these separate experiments were done at the same time in the same NTP bioassay, the Working Group called the male and female mice one study, the evidence limited, and the chemical "not classifiable." The majority of the working group that voted "inadequate" included those who interpreted the rules narrowly [and incorrectly, in the Dodger's opinion] and a faction that doesn't believe mouse liver tumors predict human cancer risk, even in female mice.

IARC (2000) Diethanolamine. IARC Monogr Eval Carcinog Risks Hum 77, 349-79.

The Catch-22 is that "everyone" knows that an NTP bioassay done under GLP will produce the same result every time. [Had rats shown an association, they might have bought that as a second study.] So there's no reason to do a second study. [Actually, it's effectively been done, since the diethanolamide-fatty acid condensation products and triethanolamine are carcinogenic in proportion to their diethanolamine contamination.] So DEA is in IARC limbo forever.

Which means that a proper risk assessment for the amount you absorbed from skin lotion with traces of DEA contaminant won't ever be done. Rest assured, there is a growing body of "mechanistic" data to support a Houdini risk assessment should this ever move forward.


The Dodger will return to the WaPo article in another post.

Sunday, January 08, 2006

Musings on Carbon Monoxide

Proc Natl Acad Sci U S A. 2003 Mar 18;100(6):3497-500. Epub 2003 Mar 5.

Neuroglobin protects the brain from experimental stroke in vivo.

Sun Y, Jin K, Peel A, Mao XO, Xie L, Greenberg DA.

Buck Institute for Age Research, 8001 Redwood Boulevard, Novato, CA 94945, USA.

Neuroglobin (Ngb) is an O(2)-binding protein localized to cerebral neurons of vertebrates, including humans. Its physiological role is unknown but, like hemoglobin, myoglobin, and cytoglobin/histoglobin, it may transport O(2), detoxify reactive oxygen species, or serve as a hypoxia sensor. We reported recently that hypoxia stimulates transcriptional activation of Ngb in cultured cortical neurons and that antisense inhibition of Ngb expression increases hypoxic neuronal injury, whereas overexpression of Ngb confers resistance to hypoxia. These findings are consistent with a role for Ngb in promoting neuronal survival after hypoxic insults in vitro. Here we report that in rats, intracerebroventricular administration of an Ngb antisense, but not sense, oligodeoxynucleotide increases infarct volume and worsens functional neurological outcome, whereas intracerebral administration of a Ngb-expressing adeno-associated virus vector reduces infarct size and improves functional outcome, after focal cerebral ischemia induced by occlusion of the middle cerebral artery. We conclude that Ngb acts as an endogenous neuroprotective factor in focal cerebral ischemia and may therefore represent a target for the development of new treatments for stroke.

BrooklynDodger comments: The Sago mine tragedy refocuses attention on carbon monoxide.

When the Dodger was in toxicology school, CO toxicity was explained mechanistically as hypoxia arising from CO competing for hemoglobin binding sites. Even then we knew there was myoglobin - remember sperm whale myoglobin as the early illustration of 3-D protein structure deduced from X-ray crystallography? But we weren't taught to think about myoglobin's role, which was binding oxygen in muscle cells, and the Dodger guesses donating that bound oxygen to metabolic processes.

The lowest-dose human health effect of CO is potentiating irregular heartbeat among people with pre-existing ischemia. To the Dodger, this appears more plausibly a direct effect of CO on the heart muscle than an indirect effect of low oxygen. The CO is carried to the tissue by hemoglobin.

So a brief tour through Medline revealed a new name, "neuroglobin," which is the same deal. Maybe it's CO binding to a tissue globin - myoglobin, or an analog like neuroglobin - rather than hemoglobin that causes these heart rhythm effects.

Saturday, January 07, 2006

Drinking and Divorce Among Geezers

Social Science & Medicine
Volume 61, Issue 11 , December 2005, Pages 2304-2316

Heavy alcohol use and marital dissolution in the USA

Jan Ostermann, Frank A. Sloan and Donald H. Taylor

Center for Health Policy, Law and Management, Duke University, Box 90253, Durham, NC 27708, USA

Available online 2 September 2005.

Using the first five waves of the US Health and Retirement Study, a nationally representative survey of middle-aged persons in the USA conducted between 1992 and 2000, we assessed the association between alcohol consumption and separation and divorce (combined as divorced in the analysis) for 4589 married couples during up to four repeated 2-yr follow-up periods. We found that drinking status was positively correlated between spouses. The correlations did not increase over the follow-up period. Discrepancies in alcohol consumption between spouses were more closely related to the probability of subsequent divorce than consumption levels per se. Couples with two abstainers and couples with two heavy drinkers had the lowest rates of divorce. Couples with one heavy drinker were most likely to divorce. Controlling for current consumption levels, a history of problem drinking by either spouse was not significantly associated with an increased probability of divorce. Our findings on alcohol use and marital dissolution were highly robust in alternative specifications.


[The other Francis Bacon]

BrooklynDodger Comments:
The Dodger hopes readers will not read too much into a continuing interest in health effects of alcohol consumption, or their absence. In this study, it appears that heavy drinking only causes divorce if your spouse doesn't drink.

The abstract leaves out the most important observations because of its focus on the analytical aims. The HRS study sample consisted of persons aged 51–61 in 1992 and their spouses, if married. Only 189, or 4%, of the sample divorced during the observation period, with the proportion falling to 0.5% during the last 2-year period. The Dodger guesses that couples making it to 51 and still married figure they have to live with whomever they have, and the longer they stay together, the better the chance of finishing out that way.
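The 4% figure follows directly from the counts quoted above; a trivial arithmetic check, assuming 189 divorces among the 4589 couples:

```python
couples = 4_589   # married couples in the HRS analysis sample
divorced = 189    # couples divorcing during follow-up

share = divorced / couples
print(f"share of couples divorcing over follow-up: {share:.1%}")
```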

Overall, 40% were abstainers, with a higher percentage of abstainers among women and among persons of both genders who did not divorce. Few women copped to heavy drinking, but 2.5% of the not-divorced men, and 5.6% of the eventually divorced men, did.