blackscientistsandinventors:

Black Scientists in the Movies


Octavia Spencer plays mathematician Dorothy Vaughan in the film Hidden Figures, which opens in January 2017. Who was Dorothy Vaughan?

The Human Computer Project: Dorothy Vaughan

Full Name: Dorothy Johnson Vaughan

Birthdate: September 20, 1910

Birthplace: Kansas City, MO

Education: BA, Mathematics, Wilberforce University, 1929

Center: Langley Research Center

Work Dates: 1943 - 1971

Position(s): Computer; Section Head, West Area Computers; Mathematician, ACD

Group(s): West Computers; ACD

Specialties: Flight paths; Scout Project; FORTRAN programming

Dorothy Vaughan came to the Langley Memorial Aeronautical Laboratory in 1943, during the height of World War II, leaving her position as a math teacher at Robert Russa Moton High School in Farmville, VA, to take what she believed would be a temporary war job. Two years after President Roosevelt signed Executive Order 8802, prohibiting racial, religious and ethnic discrimination in the country’s defense industry, the laboratory began hiring black women to meet the skyrocketing demand for processing aeronautical research data. Urgency and twenty-four-hour shifts prevailed, as did Jim Crow laws, which required newly hired “colored” mathematicians to work separately from their white female counterparts. Dorothy Vaughan was assigned to the segregated “West Area Computing” unit, an all-black group of female mathematicians who were originally required to use separate dining and bathroom facilities. Over time, both individually and as a group, the West Computers distinguished themselves with contributions to virtually every area of research at Langley.

Learning to JAM in 5 steps: New initiative reminds journalism students to archive their digital work | RJI:

muspeccoll:

If you’re a human with a computer, this is information you need to know! Journalist or not, we all have digital work that we care about saving for the future.  Learn personal digital archiving with our colleagues at the Journalism Digital News Archive.

Facebook confronts the grisly reality of live video with the shooting of Philando Castile:

Facebook is facing new questions about its policies for live video after footage showing the aftermath of a police shooting in Minnesota was briefly removed….

Indian sexual assault survivors break taboos by using Snapchat filters to tell their stories:

Instead of a traditional interview format, the two survivors spoke directly to the camera using a selfie stick. They also chose their own filters to disguise their identities as they spoke of their harrowing experiences. Both opted for a fire-breathing dragon filter. Omar says that this helped the women feel more empowered, as the masking and video recording unfolded in front of their eyes.

Can Computers Be Racist? Big Data, Inequality, and Discrimination:

digital-femme:

The problem with big data is that its application and use are not impartial or unbiased. Harvard professor Latanya Sweeney, who also directs the university’s Data Privacy Lab, conducted a cross-country study of 120,000 Internet search ads and found repeated instances of racial bias. Specifically, her study looked at Google AdWords buys made by companies that provide criminal background checks. At the time, the results of the study showed that when a search was performed on a name that was “racially associated” with the black community, the results were much more likely to be accompanied by an ad suggesting that the person had a criminal record, regardless of whether or not they did. This is just one of many research studies showing similar bias.

If an employer searched the name of a prospective hire, only to be confronted with ads suggesting that the person had a prior arrest, you can imagine how that could affect the applicant’s career prospects.
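To make the comparison concrete, here is a minimal hypothetical sketch in Python of how one might test whether arrest-suggestive ads accompany searches on black-associated names more often than on white-associated names. This is not Sweeney’s actual methodology or data: the counts are placeholders, and the chi-squared test from scipy is just one reasonable way to compare the two rates.

# Illustrative sketch only: the counts below are placeholders,
# NOT data from Latanya Sweeney's published study.
from scipy.stats import chi2_contingency

# Hypothetical tallies of [ads suggesting an arrest record, other ads]
black_associated = [120, 280]   # placeholder counts for black-associated names
white_associated = [60, 340]    # placeholder counts for white-associated names

chi2, p_value, dof, expected = chi2_contingency([black_associated, white_associated])

rate_black = black_associated[0] / sum(black_associated)
rate_white = white_associated[0] / sum(white_associated)
print(f"Arrest-suggestive ad rate, black-associated names: {rate_black:.1%}")
print(f"Arrest-suggestive ad rate, white-associated names: {rate_white:.1%}")
print(f"chi-squared = {chi2:.2f}, p-value = {p_value:.4f}")
# A small p-value means a gap this large is unlikely to arise by chance alone,
# which is the kind of disparity the post describes.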

Black to the future: afrofuturism and tech power

femmeinist-killjoy:

by Florence Okoye, 25th August 2015. Published on Open Democracy.

Social networks are another example of technologies used to promote liberation and spread consciousness about contemporary social issues.

We joke about ‘black twitter’ and ‘black tumblr’ but the reality is that these multinational, multiethnic and intercontinental networks have produced a new conscious black identity, an example of what Moya Bailey, founder of ‘Quirky Black Girls’ and member of the Octavia E. Butler Legacy Network referred to as “digital alchemy”. She describes this as the way “everyday digital media is transformed into valuable social justice media magic”. Though it is fraught with its own internal antagonisms, this network enables visible, self-organised political identities. This has galvanised many of us to unite across the world for the cause of social justice.

Of course, technology alone does not create utopias, neither is it neutral. Because of this, it is crucial to encourage engagement and ensure open access. Organisations such as Free Code Camp and Codebar.io recognize this. They provide free courses and the opportunity for mentoring to those less well represented in tech. While the big players in science and technology may always dominate, at least others can disrupt, embellish and beautify where they cannot.

Both in my own experience and that of many in the black diaspora, technology has played a crucial part in self-empowerment. Through programming, many of us are sharpening the skills to create technology to shape the world we live in for ourselves, rather than relying on handouts from the likes of Apple and Microsoft.

This article would have been a perfect reference for my dissertation, and something I wish I’d had enough space to explore more. Love how I’m still thinking about my dissertation and reading around it, such a nerd. I really hope that I’ll have saved up enough money in a few years to do a master’s *fingers crossed*. This also made me think of Candice, who just got into a free coding course at LSE ‘cos she’s so smart and great *sighs*.
Read the full article here.