Sunday, June 3, 2018

anthropomics: Who wants Charles Murray to speak, and why?

Who wants Charles Murray to speak, and why?

Some years ago, I wrote a broad critique of The Bell Curve, that old Social Darwinist psychology tome from 1994 by the hereditarian psychologist Richard Herrnstein and the conservative political theorist Charles Murray. It was in a very nice collection edited by Besteman and Gusterson (who ought to be a law firm, but are actually cultural anthropologists), called Why America's Top Pundits Are Wrong.

A few years later, Paul Erickson and Liam Murphy included it in their reader on the history of anthropological theory. In fact, the third edition of that reader (2010) actually began with Marx and ended with Marks. That was pretty cool. The fourth edition (2013) also started with Marx and included Marks, but had a couple more readings after Marks.

They kicked me out of the fifth edition (2016). No hard feelings, though, because I'm cited in their companion volume, A History of Anthropological Theory. But I know why they did it, too. My essay was very dated. It was criticizing a twenty-year-old bit of pseudoscience, which only old people remember. Richard Herrnstein is dead. Charles Murray is just a distant irrelevancy.

Well, the joke's on them.

Charles Murray is back again. He had surfaced briefly a couple of years ago, when Nicholas Wade's racist anti-science bullshit called A Troublesome Inheritance was published. That's the book that stimulated an issue of critical, negative reviews in the scholarly journal Human Biology, by the likes of Agustín Fuentes, Jennifer Raff, Charles Roseman, Laura Stein, and your humble narrator. It also stimulated a letter in the New York Times by nearly 150 geneticists repudiating Wade's invocation of their scientific field. And they ought to know.

In fact, pretty much the only mainstream review of Nicholas Wade that was positive was the one in the Wall Street Journal, by Charles Murray.  So on this side, we have the biological anthropologists and human geneticists in accord that Wade's racist screed is a perversion of the relevant sciences, in which they are, for all intents and purposes, experts.  And on the other side, the political theorist  Charles Murray, who seems to wish that the "science" in Wade's book were true, regardless of what the data show and the experts think.  That's pretty anti-science.  It's just like the creationists, anti-vaxxers, and climate-change-deniers. What do they all have in common? They like to argue the science with the scientists.

It's like mansplaining, only less gendered.  Moronsplaining.

So Charles Murray is still out there, still sponsored by the right-wing think-tank called the American Enterprise Institute, and ever ready to publicly hawk a book of pseudoscience that the scientific community repudiates. Still ready to peddle his own antiquated ideologies about rich people being genetically smarter than poor people, and to argue that social programs designed to assist the poor should be abolished, since the poor are innately stupid and the programs are therefore doomed to failure.

To the extent that class and race are correlated in the US, Murray's ideas about the poor being genetically stupid make an easy transition into the world of scientific racism. And it wasn't accidental. The Bell Curve cited literature from The Mankind Quarterly, which no mainstream scholar cites, because it is an unscholarly racist journal, supported by the Pioneer Fund, that wacko right-wing philanthropy that has thrown money at wacko eugenicists, racists, segregationists, and hereditarians of all stripes since its inception in 1937 under the aegis of the wacko eugenicist Harry Laughlin. The Bell Curve also cited the work of that racist wacko psychologist Philippe Rushton – who believed that the mean IQ of Africans is genetically set at 70, and that Africans had been r-selected for high reproductive rate and low intelligence – and then pre-emptively defended his wacko racist ideas in an appendix. Even the wacko evolutionary psychologists distanced themselves from Rushton, appreciating the toxicity of his ideas: "Bad science and virulent racial prejudice drip like pus from nearly every page of this despicable book," wrote David Barash in the journal Animal Behaviour.

But Charles Murray wasn't smart enough to see it. He couldn't see the virulent racial prejudice in the work he was defending. Or else he was blinded by his own prejudices. It's the age-old bind: ideologue or idiot?

And now the alt-right has gained political ascendancy, and Charles Murray is on a speaking tour. He gets shouted down and driven out of Middlebury College. But he gets invited to other colleges, and his message is heard.

He is invited to Notre Dame by a political science professor named Vincent Phillip Muñoz, and is civilly and effectively rebutted by Agustín Fuentes.

But let's back up a clause or two. Who is inviting Charles Murray to speak at their college, and why? At Middlebury, he was invited by Allison Stanger, a professor of international politics and economics, who explained in the New York Times that she had wanted to engage with his ideas. Likewise, Muñoz argues that "Murray makes an important argument that should be heard". Even the New York Times agrees he should say his piece.

I'm going to disagree. Charles Murray talks science that is bogus, and political philosophy that is evil, and uses one to justify the other. He doesn't need to be heard by anybody, any more than a creationist, or a pedophile, or an anti-vaxxer deserves to be heard.

So this is what I find confusing. In the free marketplace of ideas in contemporary political science, do we still entertain the scientific hypothesis that the poor deserve what little they have because they are genetically stupider than the rich? First of all, I don't know any geneticist who agrees with the second clause. A hundred years ago, geneticists believed that. Since the Great Depression, however (which democratized poverty), not too many geneticists have believed it. (The late Henry Harpending did. That was probably an example of Planck's Principle.)

Rather, nearly all contemporary geneticists seem to think that the old lefty J. B. S. Haldane more or less got it right when he said, "The average degree of resemblance between father and son is too small to justify the waste of human potentialities which an hereditary aristocratic system entails." Let me translate: You inherit a lot of stuff, and some of that stuff is genetic. But a lot of the most important stuff – like, privilege – is not. And it is a big mistake to confuse the two categories. Consequently, if you are committed to the proposition that genetic properties are more important than everything else, that is a moral proposition not supported by genetics itself, you smug bastard.

Class advantages are very real, but they aren't genetic. Doesn't everybody know that?

I think it's kind of weird that political scientists would be willing to entertain ostensibly scientific ideas – in this case about human genetics – that the relevant scientists themselves do not take seriously.

But Charles Murray isn't a geneticist. He is a genetics fanboy. Imagine that you were a professional magician, with a three-year-old child trying to convince you, and everyone else around, that everything important in life is caused by magic.

That said, however, don't think I'm going to let geneticists off the hook so easily. Sad to say, there are, and always have been, opportunistic geneticists who recognize the self-interest in telling the public that everything important in their lives is genetic. Over a century ago, there was Reginald C. Punnett, inventor of the eponymous Square, who ended the first English textbook on Mendelian genetics with the conclusion that "progress is question of breeding rather than of pedagogics; a matter of gametes, not training…. [T]he creature is not made, but born." The American geneticist Charles Davenport jumped on the Mendelian bandwagon, and soon explained class differences just as Charles Murray does. But rather than speak of cryptic factors, as Murray does, Davenport isolated the cause of those class differences in the gene for feeblemindedness. Rich white people from northern Europe had one allele; everybody else had another. But whether you speak of specific genes for feeblemindedness or cryptic genetic factors that cause the poor to be stupid, it's still fake science.


The Bell Curve capitalized on the popularity of the Human Genome Project in putting forth its thesis about the genetic stupidity of poor people in the 1990s. Some geneticists repudiated it, but others recognized, as the geneticists of the 1920s did, that it was good for the business of genetics. When Science reviewed Madison Grant's The Passing of the Great Race – a classic of American racist thought, which was read in defense of Karl Brandt at the Nuremberg trials to show that the Germans had simply been doing what the Americans were advocating – it concluded with a sobering thought: "This is a book that will … help to disseminate the ever-growing conviction among scientific men of the supreme importance of heredity." Sure, the genetic theory in question might be inane, might be evil, and it might be false, but it definitely is good for business. More recently, the Human Genome Project was backed up with all sorts of purple prose about how your DNA sequence was the most important thing about you: The Code of Codes, The Book of Man, and the like. They knew it was bullshit then, and that's why there is such interest in epigenetics now.


These geneticists are reprehensible, because they provide the hereditarian soil for scientific racism. The geneticists may not themselves be racists, but their idiotic statements about what they think their knowledge applies to have indeed sometimes crossed over. James D. Watson, who knows more about DNA than you do, caused a stir a decade ago, when he said that different groups of people have different "powers of reason". The rest of the genetics community disagreed, and challenged his own powers of reason.

And here is the newest exhibit. A video from the famous mouse genetics lab in Bar Harbor, Maine. It tells you about genetics and genomics, and how genetics controls things like your eye color and good taste.

Wait, what? (It's at 0:15). Good taste is genetic?

Well, she was a bit coy about it, wasn't she? She delivered the line with a giggle, and the disclaimer, "maybe even good taste".

Geneticists know that good taste is not genetic, because good taste is context-dependent and locally specific. Geneticists of the 1920s knew that it was in their short-term interest to have the public believe that any and all shit was innate. But the field evolved, and can't afford to devolve.

It would be nice if we could get beyond genetics-vs-culture, so we could talk more comprehensively about "embodiment".  But the hereditarians and racists won't allow it.

We should not be debating the innate intelligence of black people, or of the poor, on college campuses or anywhere.  It is a morally corrupt pseudoscientific proposition. 


It's like inviting a creationist or an inventor of a perpetual motion machine. The university should not be a censor, but it sure as hell is a gatekeeper. At this point, sometimes they go all radical epistemological relativist and say that all ideas deserve a hearing. But all ideas don't deserve a hearing. The universe of things that do get discussed and debated on college campuses is rather small in proportion to the ideas that people have debated over the years. Should we stone witches? No. Might the speed of light be 140,000 miles per second, rather than 186,000? No. Might the universe just be made up of earth, air, water, and fire? No. Might Africans just be genetically stupid? Might people who want to debate this point have their fundamental civic morality called into question instead?





Let me give the last word, then, to Allison Stanger, who invited Charles Murray out to Middlebury College and got roughed up a bit, because she thinks that the innate intelligence of black people ought to be a debatable topic; which apparently ruined the pleasure she ordinarily derives from tormenting marginalized people. As she casually explained it in the New York Times:
I had tough questions on both the controversial "Bell Curve," in which he partly blames genetics for test score differences among races ... But the event had to be shut down, lest the ensuing dialogue inflict pain on the marginalized.




-----------------------------------------------

[Note:  Apparently Stanger herself did not invite Murray, but "welcomed the opportunity to moderate a talk with him on campus."  In any case, we still disagree on the central issue of whether the innate intellectual capacities of non-white people should be a subject open for debate on campuses in 2017.]




Sunday, May 13, 2018

Why I Escaped the ‘Intellectual Dark Web’ - The Chronicle of Higher Education

Why I Escaped the 'Intellectual Dark Web'

Pissing off progressives isn't intellectual progress

By Alice Dreger May 11, 2018
Matt Roth for The Chronicle Review

Conventional wisdom says that if a staff writer for The New York Times wants to feature you in a story about brave intellectuals, you reply, "Yes, please!" This is especially true if the Times sends a Pulitzer Prize-winning photographer to create a noble portrait of you for an accompanying visual pantheon.

But every time that Times writer, Bari Weiss, called to talk with me about the "Intellectual Dark Web" and my supposed membership in it, I just started laughing. In case you missed it — though, really, how could you, considering that it seems to be everywhere at the moment? — the Times recently published a piece about a bunch of renegade intellectuals who "dare venture into this 'There Be Dragons' territory on the intellectual map."

Why was I laughing? The idea that I was part of a cool group made me think there was at least some kind of major attribution error going on. The confused feeling was exacerbated by the dramatic photo setup: Damon Winter, the Pulitzer winner, had me standing in a marsh full of tall, dry reeds, waiting for a particular moment just past sunset. "Why am I in this scene?" I wondered as we waited, and not just because my favorite dress boots were getting muddy and I worried about ticks.

I also had no idea who half the people in this special network were. The few Intellectual Dark Web folks I had met I didn't know very well. How could I be part of a powerful intellectual alliance when I didn't even know these people?

When I asked what this group supposedly had in common, the answer seemed to be "they've climbed the ladder of fame by pissing people off, saying stuff you're not supposed to say." They regularly made progressives angry with "politically incorrect" statements about gender, race, genetics, and so on. This troubled me the most — that one might think of pissing people off as an inherent good, a worthy end.

While I am very experienced at being annoying, including to members of my own progressive tribes, I don't think this is a technique that should on its own be valorized. Pissing people off is something to be done accidentally, as a side effect, while you're trying to fix a significant problem. Yet the operating assumption behind the Intellectual Dark Web seems to be that angering progressives represents a mark of honor in itself. Indeed, the group's signature hack is leveraging these alleged badges of honor into greater fame and fortune. (Witness the singular genius of Jordan Peterson.)

I knew that some of the people named as part of the IDW are, like me, legitimate scholars — they care about research, data, and changing their own minds through honest inquiry. But that just made me wonder why these enlightened souls would want to be glorified as part of a "dark web." Perhaps they were in universities just long enough to get the pernicious message from their central administrations that all publicity is good scholarship (until it is cause for firing)?

The Times article confirmed my initial fears — and made me glad that I asked to be left out (which I was). The article begins by breathlessly reporting that the IDW is rife with "beauty" and "danger." So, what even is it? Here's the vague rundown: "Most simply, it is a collection of iconoclastic thinkers, academic renegades and media personalities who are having a rolling conversation — on podcasts, YouTube and Twitter, and in sold-out auditoriums — that sound unlike anything else happening, at least publicly, in the culture right now."

Meh. How is this really about intellectualism, darkness, or a special web? If these people are having conversations that are so rare "in the culture," how is it that they have millions of followers and pack auditoriums? (Is "the culture" The New York Times?)

The whole thing — especially the excitement over these people having found a "profitable market" — made me identify anew with that person standing in the ESPN-televised crowd at some SEC football game holding the sign that said, "You people are blocking the library." I don't see it as a sign of intellectual progress when a bunch of smart people find a way to make money off of niche political audiences by spewing opinions without doing much new research.

Opinion is not scholarship, it is not journalism, and we are dying for lack of honest, fact-based, slow inquiry. Twenty years since my first scholarship-based op-ed ran in The New York Times, here's what I see: a postapocalyptic, postmodern media landscape where thoughtfulness and nonpartisan inquiry go to die. The Intellectual Dark Web isn't a solution; it might just be a sign of end times.

I'm all for bringing intellectualism to the masses, but like a lot of academics, I value ambivalence itself, along with intellectual humility. Yet these values seem in direct opposition to the kind of cocksure strutting that is the favored dance move of the IDW.

What I'm left with after this experience is a sense, for myself, of how much academe matters. How we need to fight back against university administrators' equation of "entrepreneurship," funding, and publicity with scholarship. How, since resigning my position at Northwestern University over my dean's censorship of my work, I miss the Intellectual Light Web, the crisscross of walking paths that bisect the campus green. How we need job security to keep people from going to the dark side.

Professors, listen to me: You don't want to be in this dark-web thing, even if it comes with an awesome trading-card photo. You are in the right place. Carry on.

Alice Dreger is a historian of medicine and science and the author of Galileo's Middle Finger (Penguin, 2015).




Wednesday, May 9, 2018

I’d whisper to my student self: you are not alone

Twenty years on, Dave Reay speaks out about the depression that almost sunk his PhD, and the lifelines that saved him.

COMMENT

Dave Reay

[Illustration by Neil Webb @ Debut Art: a person at the prow of a ship gazing out across the sea at the sunset.]

A black dog. That's how Winston Churchill described depression. A dark companion that lies quiet in the good times and is your master in the bad. Its arrival is hard to predict, just as its departure seems impossible until it has happened. Depression is as varied as the people who experience it, as shifting in form as a murmuration. Yet its flavour is unmistakable.

It is twenty years since my own black dog stalked away. Twenty years since debilitating depression saturated my every thought and act. Survivors are forever wary that the dog will return. That it will overcome the new defences, leap the redoubts. It's always out there somewhere, skulking; reminding us to watch our step.

Mental illness affects around a quarter of us at some point in our lives [1], with more than 300 million people battling depression worldwide [2] and academics at greater risk than many [3]. The pressures to publish, make an impact, win funding, chase tenure, engage the public, shine on social media and influence policy, combined with failing experiments, lone working and rigid hierarchies, are all threats to our well-being. Yet our job is a thing of beauty, too. To explore new ideas, to grapple with the unknown, and ultimately to understand: that is a privilege.

"The saving grace was really my colleagues around me and they didn't even know it"

As scientists, we talk, a lot. To our students and peers, to policymakers and funders. We become experts in telling the world about our work and opinions, but we risk not truly hearing ourselves or those around us.

My own two decades of silence were borne of fear. Fear of ridicule and mistrust, of overt hostility and covert isolation. Talking to friends, sharing memories of the dark times, has tempered these fears. Although speaking openly about my illness still makes me anxious, it is nothing to the waking nightmare lived by my student self.

If I could speak to him now, I'd tell him not to be afraid, to trust in people. I'd tell him how wonderful the warmth of those he spoke to will feel, how sharing the pain of his mental illness would diminish its effects, not him. I would hug him and whisper: "You are not alone."

I tell this story for anyone who is silenced by the stigma of depression. I tell it to thank the wonderful people who gave me a voice, and then listened. I tell this story because I can, because I survived.

Early one year in the mid-1990s my undergraduate road was nearing its end. Three years of marine biology, substance abuse, mild poverty and falling in love. By October, I was a doctoral student 500 kilometres away from family and the love of my life.

It was natural to be scared. New people, new place, unknown rules and expectations. On the tall stool at my carefully demarcated square of lab bench I sat each day, trying to fathom the instructions of my supervisor. The research was on impacts of warming in the Southern Ocean — growing algae in the lab for starters, maybe a trip south later on. It was a masquerade from the start. The techniques and equipment were alien, my nodding response to instruction, an act. I tried to mimic the easy confidence of the other students, to fit in. My supervisor and his group must have assumed I had some idea of what to do. I hadn't. Not a clue.

Batch after batch of my growing medium went bad. Each morning, returning to the lab from my bedsit, the large glass bottles would have the unmistakable milkiness of contamination. Hours of weighing, mixing and sterilization would follow as I flushed with embarrassment and tried to suppress a mounting fear. Fear of being reprimanded. Fear that I was just too stupid. Fear that everyone would find out.

The days merged into weeks; the failures continued. My supervisor called me to his office. He was unimpressed. I needed to do better, work harder, get it right. Walking back to my bench that day, a black dog walked with me. When exactly it had arrived I can't say. I was 21. My ramshackle mental defences had been crumbling for a while. I'll never know what tore the breach. It may have been an admonishment from a lab technician or just one more cloudy culture bottle. Whatever had splintered the final defence, depression was my new master.

It is amazing how we can maintain a facade of normality while behind it a maelstrom of disintegrating sanity roars. By Christmas, I was thinner and quieter, but still me. Time with family and fiancée kept the black dog subdued. It waited.

Instead of seeking help, I counted down the remaining days of holiday like a condemned man. Sometimes you can see deep depression coming for you, slamming doors of escape, tightening the orbits of desperation. Deciding I would kill myself was a relief. I wrongly felt that it was the only door left open. One place the black dog could not follow.

Saying goodbye to my family when I returned to university that January was brutal. My mask could not withstand the warmth of those people I so loved. Through the sobbing, fear drove me on, piloting my body through the long journey back to my term-time lodgings.

Alone in the house I ran a hot bath, drank whisky without tasting it and selected the sharpest kitchen knife. I sat in that bath for an age. The knife hurt on my flesh, whisky or no. I wept and retched and wept some more. Too afraid to cut deep, anxious about what people would think, about the pain. I drained the bath and went to bed.

The months that followed were a fluctuating grey-scale of near-normality and despair. Depression is exhausting. Even eating and washing felt irrelevant to my own contracting existence. Some days were doable, others I spent lying balled and wide-eyed, waiting for the night. When it arrived, sleep delivered only confused marathons of lost time.

My second attempt at suicide came just before Easter and was longer in the planning, the means gleaned from the Dick Francis mystery Comeback and obtained from the lab's chemical store. An injection, some nausea, a short blackout, that's all.

Throughout this period I self-medicated with alcohol, and cannabis when it was available; anything for oblivion. I didn't seek help, therapy, or prescription drugs, nor did I quit my PhD. I survived long enough for help to find me, eventually.

By the summer, life settled. Research, relationships, even my supervisor: all the quakes that had brought down the walls of my mind were still. I laid plans for work on an icebreaker in the South Atlantic that coming winter. It was exciting, an opportunity I had dreamed of, and yet still one in which I pictured my own destruction: deep inside, thoughts of suicide lingered, a secret obsession. A vision of slipping unseen into a remote sea, all a terrible accident, a tragic loss.

It was people who saved me. They still do. A warm word from a friend, a joke and a moan with a colleague. Day by day, piece by piece, the people around me unknowingly brought my mind back to health. One man in particular — a beautiful human and can-do lab technician named Paul Beckwith — helped me to trust in others, to share and to belong.

In December, just over a year after the start of my PhD, I joined the research ship near the Falkland Islands and my convalescence accelerated. I sipped daily on a tonic of big-hearted, inspirational and funny scientists. Being stuck together in floating isolation for months could have been torment. Instead it was my salvation.

The cloudy jars and experimental despair were replaced by thriving baths of algae, and geeky thrills as the microbes responded to artificially induced future climates. Even through the darkest days, gripping my lab bench to suppress the convulsions within, I never hated the science. Now that I could finally understand how to do it, I loved it.

The savage beauty of the South Atlantic, its sparkling wildlife and scientists, will always be a touchstone for me. A month after joining the ship and at the end of a long night shift, I stood alone on its rolling deck. Dawn broke, and tears streamed down my face. Tears of joy and release. The black dog was gone.

The two decades since have sewn their own patchwork of light and dark. Deaths of those close to me, including the beautiful Paul, have called my black dog close again at times. But, for me, it is kept at bay by lucky brain chemistry, and the warmth and understanding of my wife (that love of my life), and of our children and friends. Not everyone is so fortunate.

As the supervisor of dozens of students and staff, empathy is useful, listening is vital. Just a chat, making time for people; it can mean the world. We are mentors, not therapists, but our university communities are under enormous stress. Insecure jobs and mounting debt, endless metrics and poor management — all are risks to our mental health in the edifice of corporatization that our higher-education system has become.

Awareness is growing, with supervisor training, peer-support networks and counselling services now commonplace. More importantly, the stigma of mental health issues is fading. Whether it's to a friend or loved one, a therapist or colleague, those afflicted by mental illness must feel secure in speaking out. Silence was for too long the enemy of my own healing. Next time that black dog comes close, it will be met with a roar.

Nature 557, 160-161 (2018)

doi: 10.1038/d41586-018-05080-6


References

1. Kessler, R. C. et al. Epidemiol. Psichiatr. Soc. 18, 23–33 (2009).
2. World Health Organization. Depression and Other Common Mental Disorders: Global Health Estimates (WHO, 2017); available at http://go.nature.com/2ksfnhg
3. Royal Society & Wellcome Trust. Understanding Mental Health in the Research Environment: A Rapid Evidence Assessment (Royal Society, Wellcome Trust, 2017); available at http://go.nature.com/2p8fq8r





Tuesday, May 8, 2018

How to Build Anything Out of Aluminum Extrusion and 3D Printed Brackets | Hackaday

How to Build Anything Out of Aluminum Extrusion and 3D Printed Brackets

The real power of 3D printing is in infinite customization of parts. This becomes especially powerful when you combine 3D printing with existing materials. I have been developing a few simple tricks to make generic fasteners and printed connectors a perfect match for aluminum extrusion, via a novel twist or two on top of techniques you may already know.

Work long enough with 3D printers, and our ideas inevitably grow beyond our print volume. Depending on the nature of the project, it may be possible to divide a design into pieces and then glue them together. But usually a larger project also places higher structural demands that are ill-suited to plastic.

Those of us lucky enough to have nice workshops can turn to woodworking, welding, or metal machining for larger projects. Whether you have that option or not, aluminum extrusion beams provide the structure we need to go bigger and to do it quickly. And as an added bonus, 3D printing can make using aluminum extrusion easier and cheaper.

Everything is Built from Aluminum Extrusion

Aluminum extrusion beams are no stranger to these pages. We have a general overview of them and we have seen so many projects using extrusions, like satellite trackers, Rubik's cube solvers, and automated drone charging stations. Many popular 3D printers have frames of aluminum extrusion, as it offers higher strength and superior dimensional tolerances compared with 3D printed plastic.

But creativity is quickly stifled by connector selection, as the vast majority of connectors in the catalog are for 90-degree joints of one flavor or another. Angled connectors are in the minority, and usually limited to a few angles like 30, 45, and 60 degrees. There isn't enough sales volume to justify making connectors with angles that only a few people would ever use. This is where our 3D printer re-enters the picture, as a factory for low volume niche parts.

Custom Doesn't Mean Hard

Combining custom with stock allows us to design projects leveraging the strengths of each part: the aluminum extrusion provides generic structure, with the 3D printed plastic linking them together in a project-specific way. We've covered 3D-printing right angle connectors before (or in conjunction with zip-ties) but today we're focused on 3D printing's advantage for very precisely building arbitrary angles.

The CAD work for these connectors is pretty simple. It is typically a matter of subtracting out a rectangular solid for the beam itself, followed by subtracting a few cylinders to create mounting holes for fasteners. Bolting to an extrusion on three sides (like the rocker arm example here) is usually strong enough for 3D printed plastic projects. This means we can often skip the "top" (relative to print bed) side for easier printing. Sometimes we'll want that strength badly enough to deal with bridging or some other technique to give us that fourth side, but we'll leave that challenge aside for now. The point is that you can give this a try with minimal effort and adopt more of the technique as you get used to it.

An Example: Hard Angles Made Easy

I've been working on a design for a rocker arm sub-assembly and it makes a perfect example of this discussion. It is part of the suspension system for a robot I'm building. The entire design is far too large to print as a single piece, so I divided the object into aluminum segments linked together by 3D printed parts. Here are the wireframe diagrams:

Three-view of rocker arm with two aluminum extrusion beams adjacent to it.

Look at the relative angles of these two extrusion segments — no extrusion vendor would stock connectors at these angles, and even if one did, it still wouldn't give you a pair of bearings at the center and a mechanical linkage up top. Such project-specific details make this hub ideal for one-off 3D printing. As for the arms, we could spin up a 3D print job for two long rectangular blocks of plastic. But why? The only project-specific detail here is length, and a length of aluminum can be cut far faster than an identical length of plastic can be printed, even if the cutting were done with a hand-held saw.

This article focuses on where the aluminum meets the plastic, so I'm not going to go deep into the details of setting up your angle or removing the area for the extrusion. But it's worth noting a few general 3D printing behaviors that strongly impact this construction technique. Select your layer orientation wisely — when printed parts are pushed beyond their limits, they generally fail by fracturing along layer boundaries. Use at least two fasteners along each axis, spaced approximately as far apart as the beam width, to spread out the load. Using just one fastener turns that single point into a fulcrum, multiplying the forces on our printed parts, and we don't want that.
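To see why one fastener is so much worse, a quick back-of-the-envelope moment balance helps. This little Python sketch uses made-up numbers (they are not from the rocker arm design) to show how fastener spacing sets the force each fastener actually sees:

```python
# Rough moment balance: why one fastener acts as a fulcrum.
# A lateral load F applied at lever arm L creates a moment F * L.
# A single fastener must react that moment at one point; two
# fasteners spaced s apart react it as a force couple, so each
# sees roughly F * L / s instead.

def couple_force(load_n, lever_mm, spacing_mm):
    """Approximate force on each of two fasteners reacting a moment."""
    return load_n * lever_mm / spacing_mm

# Hypothetical numbers: 10 N load at the end of a 100 mm arm,
# fasteners spaced one beam width (15 mm) apart.
print(f"{couple_force(10, 100, 15):.1f} N per fastener")   # 66.7 N

# Halving the spacing doubles the force on each fastener, which is
# why spacing them about a beam width apart is a good rule of thumb.
print(f"{couple_force(10, 100, 7.5):.1f} N per fastener")  # 133.3 N
```

The exact numbers don't matter; the takeaway is that the reaction force scales inversely with fastener spacing, so wider spacing means gentler loads on the plastic.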

One more point to consider is overhang when bolt holes do not align nicely with the print bed. We can help our printer bridge odd bolt head surfaces by creating a thin layer (2-3 print layer heights thick) covering the bolt hole which can be cleaned up with a drill after serving its purpose. These are easier to remove and consume less material than the typical solution of printing supports.

The Secret Sauce: A Replacement for T-slot insert, T-nut, T-whatever

Once we have a project-customized connector designed and printed, we proceed to assembly where we face the other hidden gotcha of aluminum extrusions: specialty fasteners. Called T-slot inserts, T-nuts, or some similar name, they are shaped specifically to fit in the slot of an aluminum extrusion beam. They are also extremely expensive. Not necessarily in an absolute sense, but certainly relative to commodity hex nuts of similar size. Our rocker arm example is designed for 15mm Misumi 3 Series extrusions. Misumi offers several specialty nuts for this extrusion, the least expensive "economy" model HNSQ3 costs $8.28 for a bag of 100. In contrast, our trusty hardware supplier McMaster-Carr offers generic M3 hex nuts at $0.88 for a bag of 100. (Catalog #90592A085)
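Doing the per-nut arithmetic on those two catalog prices makes the gap obvious. A trivial Python check of the figures above:

```python
# Per-nut cost comparison using the catalog prices quoted above.
specialty_bag = 8.28  # Misumi HNSQ3 "economy" T-nut, bag of 100
generic_bag = 0.88    # McMaster-Carr 90592A085 M3 hex nut, bag of 100

specialty_each = specialty_bag / 100  # $0.0828 per nut
generic_each = generic_bag / 100      # $0.0088 per nut

discount = 1 - generic_bag / specialty_bag
print(f"Generic nuts cost {discount:.0%} less")  # Generic nuts cost 89% less
```

That works out to roughly a 90% discount per fastener, which adds up fast on a build with dozens of joints.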

15mm Misumi 3 Series (right) is friendlier to using generic M3 hex nut (shown) than its 20mm sibling 5 Series (left) shown with matching specialty fastener.

Before we give in to the temptation of a 90% discount, let's do a quick review of what we trade off by using generic nuts. The first and most obvious barrier is that a particular extrusion beam's profile might not allow a generic nut to be used. Second, a generic nut will not engage the slot's interior surfaces to the same degree as a nut tailored to the extrusion profile. This usually means we can't torque the bolt down as tightly before something slips. Third, higher-end specialty nuts provide friction against the extrusion, through their shape, their size, or an additional thin metal spring that holds the nut in place. Nuts that stay in position instead of sliding around in the slot ease assembly, translating into less wasted time and lower frustration for the assembler. Such labor savings can outweigh a specialty fastener's cost.

For this rocker arm example, the first concern does not apply as Misumi 3 Series is tolerant of generic M3 nuts. Item number two — reduced fastener torque limit — is acceptable as long as it is still sufficient for what we need on a 3D printed plastic project. And item three — assembly convenience — is something we can get back with some clever 3D printing.

We now apply 3D printing's strength (a factory for low volume, task specific objects) to generic fasteners in aluminum extrusion beams. Our technique employs a small tool designed specifically for the job. For starters, this tool has a little hump to act as a spring that helps hold its position in the slot. Then, because the tool is designed alongside the part it will be used with, we can design it with trays to hold generic nuts spaced exactly at the distance we need. And as icing on the cake, we add a little hook to the end. This hook — designed to be exposed when the hub is in place on the arm — allows us to fine-tune our position without having to take things apart.

To fit within the narrow spaces left in the slot, the tool is printed as thin as possible: the width of our 3D printer's nozzle. This also makes economical use of filament and it's very fast to print, both attributes useful for a disposable item since this thin positioning tool can't be removed once the bolts are tightened down. It is well worth the sacrifice, though, because it turns a frustrating exercise of aligning small fasteners into a trivially simple task.

Give this fastener method a try. I hope you find the technique useful, and I look forward to seeing more projects that combine the strengths of 3D printing and aluminum extrusion beams into something that neither could do alone. If you are proud of your result, don't be shy about letting us know via our tips line or maybe as an entry to our Hackaday Prize. Happy building!




Thursday, May 3, 2018

What 'A Nation At Risk' Got Wrong, And Right, About U.S. Schools | WAMU

Very few government reports have had the staying power of "A Nation At Risk," which appeared 35 years ago this month and stoked widespread concerns about the quality of American schools.

"The educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a nation and as a people," the authors thundered in one of its best-known passages.

When it appeared in April 1983, the report received widespread coverage on radio and TV. President Reagan joined the co-authors in a series of public hearings around the country.

The report's narrative of failing schools — students being out-competed internationally and declining educational standards — persists, and has become an entrenched part of the debate over education in the U.S.

Prudence Carter, the dean of the Graduate School of Education at the University of California, Berkeley, teaches her students that "A Nation At Risk" was a "pivotal moment" in education policy — the beginning of a "moment of angst" about the state of the nation's schools.

That angst found expression, she says, in the No Child Left Behind law of 2002 and the Race to the Top initiative of 2009, and is still enshrined in federal law today.

Although there has been some progress, "the reason that we continue to mark the anniversary is that [the worry] still rings true," says Michael Petrilli, president of the Thomas B. Fordham Institute. He calls the report "a touchstone"; it's in the mission statement of the institute, which promotes school choice, testing and accountability.

This month, U.S. Education Secretary Betsy DeVos invoked the report's anniversary in remarks to the Reagan Institute Summit on Education, convened for the occasion. "Our nation is still at risk," she concluded.

But what I learned in talking to two of the original authors of "A Nation At Risk" was that they never set out to undertake an objective inquiry into the state of the nation's schools.

They started out already alarmed by what they believed was a decline in education, and looked for facts to fit that narrative.

And while their report is still widely cited, a second official federal government analysis of standardized test scores, produced just seven years later, showed the opposite of what was claimed in "A Nation At Risk." That analysis found, instead, "steady or slightly improving trends" in student achievement.

The looming disaster depicted in "A Nation At Risk," it turns out, was a matter of interpretation.

I interviewed Yvonne Larsen, the vice chair of the commission that wrote the report, for my 2015 book The Test. Here's how she described what happened:

"I was called by [President Reagan's] office. They told us that we were going to have a commission … to address the challenge that we faced in trying to upgrade America's education to the rigorous education that we had in the past … We felt the rigor in our schools had diminished. We were concerned. There was a strong feeling that if we continued how we were going, we wouldn't continue to improve."

Gerald Holton, now professor emeritus of physics and the history of science at Harvard University, was another member of the commission. He drafted some of the most alarmist language in the document, including the now-famous line: "If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war."

Like Larsen, he said that he and his co-authors set out to confirm their existing concerns about the state of America's schools.

Back then, "education was not on the front page," he recalls. "It was more inflation and gas lines." But he and his colleagues "knew that trouble was ahead. We knew something had to be done."

In making the case that trouble was ahead, the authors used language that Bruno Manno calls "apocalyptic, almost militaristic" — and he's an admirer of the report. Manno worked in the Department of Education starting in 1986 and now serves as a senior adviser to the Walton Family Foundation's K-12 education initiative. (Walton supports NPR's coverage, including of education.)

"That was done on purpose to capture the attention of the American public," he says.

"A Nation at Risk" cited statistics such as: "The average achievement of high school students on most standardized tests is now lower than 26 years ago when Sputnik was launched," and "[The SAT demonstrates] a virtually unbroken decline from 1963 to 1980. Average verbal scores fell over 50 points and average mathematics scores dropped nearly 40 points."

Those numbers weren't made up. But they weren't the only ones out there.

The report de-emphasized the fact that more students than ever were graduating from high school and attending college, and that top U.S. students led the world in academic achievement.

The Department of Energy — yes, Energy — commissioned a follow-up analysis of test score trends in 1990. It was known as the Sandia Report, after the federally funded Sandia National Laboratories which produced it.

Its authors were engineers trying to generate economic forecasts, not education authorities with an ax to grind. And they didn't diagnose the same disaster that "A Nation At Risk" did.

"To our surprise, on nearly every measure, we found steady or slightly improving trends," one of the authors, Robert Huelskamp, later wrote.

How could this be? Because of a statistical effect known as Simpson's Paradox.

In the early 1960s, college-going was still rare. It was mostly top students, largely well-off white males, who took standardized tests like the SAT and applied to college.

By the 1980s, college was more available to more people, and more important to getting a good job. Many more people were taking the SATs and applying to colleges. This included more people of color, more low-income students and other historically disadvantaged groups.

So, when you lumped everyone's scores together, as "A Nation At Risk" did, you saw declining average scores from the 1960s to the 1980s.

But, when you broke out test takers by subgroup, as the Sandia Report did, looking at men, women, whites, Hispanics, African-Americans and low-income students separately, you found that the scores of most of these groups were improving slightly over that time.
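A tiny Python sketch with made-up numbers (illustrative only; these are not real SAT figures) shows the mechanism: every subgroup's average rises, yet the pooled average falls because the mix of test takers shifts toward a historically lower-scoring group.

```python
# Simpson's Paradox with invented numbers: each subgroup improves,
# but the combined average drops because the test-taking pool shifts.

def overall(groups):
    """Weighted average score across (count, mean_score) subgroups."""
    total = sum(n for n, _ in groups)
    return sum(n * score for n, score in groups) / total

# 1960s: a small, mostly high-scoring elite takes the test.
early = [(90, 540), (10, 400)]
# 1980s: both subgroups score higher than before, but far more of
# the lower-scoring group now sits for the exam.
late = [(100, 550), (100, 420)]

print(overall(early))  # 526.0
print(overall(late))   # 485.0  (lower, despite both groups improving)
```

The subgroup means went up (540 to 550, 400 to 420), yet the overall mean fell from 526 to 485 purely because of the change in who was taking the test.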

"The idea that American schools were worse just wasn't true," says James Guthrie, an education professor at Lynn University in Florida. Guthrie published a scholarly article in 2004 titled "A Nation At Risk Revisited: Did 'Wrong' Reasoning Result in 'Right' Results? At What Cost?"

"I looked at it every which way," he says now. The authors in 1983 "were hell-bent on proving that schools were bad. They cooked the books to get what they wanted."

Holton objects to this view. "We put our honor and our lives into this report, and we were not being taken for a ride by anybody," he says. "These were serious people ranging from a Nobel Prize winner [Glenn T. Seaborg] to the head of Bell Telephone Labs [William O. Baker]."

"A Nation At Risk" got the national spotlight.

The Sandia Report got something very different. Its publication was delayed for many months. It's been cited as a famous case of censorship.

Diane Ravitch, then a Department of Education official under President George H. W. Bush, wrote an op-ed critical of the Sandia Report headlined "U.S. Schools: The Bad News Is Right."

Ravitch later publicly renounced this position and others, and became a bestselling author and advocate focused on educational equity. When "A Nation At Risk" came out, "I thought, oh boy, this is going to shake everybody up. It's a good thing," she tells NPR.

"Now, I think it sounded an alarm that was misguided, because the schools were not sunk in mediocrity."

That alarm, and the message — "the bad news is right" — has been repeated countless times in the decades since, by philanthropists, business leaders, politicians and other reformers. Ravitch notes that schools may be blamed when times are tough but not necessarily credited when things are going better.

"A Nation at Risk," she says, was "written at a moment when we were in recession. When our economy was booming, nobody said, 'Gosh, we must have really great schools.' "

The habit of criticism of student test scores persists. For example, earlier this month, the Nation's Report Card came out, showing steady scores in most areas and improvement since 2015 in one area out of four tested, eighth grade reading. Headlines called the results "disappointing".

If you look at NAEP trends over the long term, 9- and 13-year-olds scored modestly higher in reading and mathematics in 2012 than they did in the mid-1980s.

When it comes to the SAT, meanwhile, both the overall number and the racial and socioeconomic diversity of people taking the SAT continue to rise. And the test has changed over time. But according to the College Board, which reports the results in a consistent format, scores have continued fairly steady.

  • In 1983, the averages in reading were 508 for boys and 498 for girls. In math: 516 for boys, 474 for girls.
  • In 2016, the averages in reading were 495 for boys and 493 for girls. In math: 524 for boys, 494 for girls.

Meanwhile, in more than half of states, public schools are receiving less total state funding than they were a decade ago.

In all but a few states, teachers earn less than what other professionals with a similar level of education are making. Just 11 states direct more money to districts full of impoverished students than to affluent districts in consideration of their greater needs, a figure that has declined by half since the Great Recession.

And, just over 50 percent of public school students are now eligible for free or reduced-price lunch because of their family income.

In the context of declining resources and rising child poverty, maintaining steady or slightly improving test scores over decades could be described with other words besides "flat" and "disappointing" — perhaps "surprising" or "heroic."

But the narrative established by "A Nation At Risk" still seems to be the one that dominates how we think of the data.

Guthrie, for one, thinks that's been, on balance, a good thing, because it brought education to the front and center of the U.S. agenda.

"My view of it in retrospect," he says, "is seldom, maybe never, has a public report been so wrong and done so much good."

Copyright 2018 NPR. To see more, visit http://www.npr.org/.