Talking in school

Learning by teaching

A few months ago, I was asked by a teacher at a local school to come in and talk to year 10 and year 11 students (aged 14-16 or so) about my job, what I do, my background, how I got into my job and to give any further thoughts and advice.  Today I got the chance to go in and talk to them.

I very much enjoyed myself[1], and hopefully it was interesting for the pupils as well.  I went over my past – from being “a bit of a geek at school” through to some of the stuff I need to know to do my job now – and also talked about different types of work within IT security.  I was at pains to point out that you don’t need to be a great mathematician or even a great coder to get a career in IT security, and talked a lot about the importance of systems – which absolutely includes people.

What went down best – as is the case with pretty much any crowd – was stories.  “War stories”, as they’re sometimes called, about what situations you’ve come across, how you dealt with them, how other people reacted, and the lessons you’ve learned from them, give an immediacy and relevance that just can’t be beaten.  I was careful not to make them very technical – and one about a member of staff who had lost weight while on holiday and got stuck in a two-door man-trap (which included a weight sensor) went down particularly well[3].

The other thing that was useful – and which isn’t always going to work in a C-level meeting, for instance – was some exercises. Codes and ciphers are always interesting, so I started with a ROT13, then a Caesar cipher, then a simple key, then a basic alphabet substitution.  We talked about letter frequency, repeated words, context and letter groupings, and the older group solved all of the puzzles, which was excellent.
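
For anyone wanting to run a similar session, the first couple of exercises can be sketched in a few lines of Python (the messages and shift values here are just examples, not the ones I used):

```python
def caesar(text, shift):
    """Shift each letter by `shift` places, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

def rot13(text):
    """ROT13 is just a Caesar cipher with a shift of 13 – handily, it's its own inverse."""
    return caesar(text, 13)

print(rot13("Hello"))                 # Uryyb
print(caesar("attack at dawn", 3))    # dwwdfn dw gdzq
print(caesar(caesar("attack", 5), -5))  # shifting back decrypts: attack
```

The nice property to point out to a class is that applying ROT13 twice gets you back where you started, which makes checking answers easy.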

There was time for some questions, too, which included:

  • “how much do you get paid?”  Somewhat cheeky, this one, but I answered by giving them a salary range for a job which someone had contacted me about, but which I’d not followed up on – and gave no indication of my reasons for rejecting it.
  • “do you need an IT or computing degree?”  No, though it can be helpful.
  • “do you need a degree at all?”  No, and though it can be difficult to get on without one, there are some very good apprentice schemes out there.

I went into the school to try to help others learn, but it was a very useful experience for me, too.  Although all of the pupils there are taking a computing class by choice, not all of them were obviously engaged.  But that didn’t mean that they weren’t paying attention: one of the pupils with the least “interested” body language was the fastest at answering some of the questions.  Some of the pupils there had similar levels of understanding around IT security to some C-levels who aren’t in IT.  Thinking about pace, about involving members of the audience who weren’t necessarily paying attention – all of these were really useful things for me to reflect on.

So – if you get the chance[4] – consider contacting a local school or college and seeing if they’d like someone to talk to them about what you do.  Make it interesting, be ready to move on when topics aren’t getting the engagement you’d hoped for, and be ready for some questions.  I can pretty much guarantee that you’ll learn something.


1 – one of my daughters, who attends the school, gave me very strict instructions about not talking to her, her friends or anyone she knew[2].

2 – (which I have every intention of ignoring, but sadly, I didn’t see her or any of her friends that I recognised.  Maybe next time.)

3 – though possibly not with the senior manager who had to come out on a Sunday to rescue him and reset the system.

4 – and you’re willing to engage a tough audience.

Moving to DevOps, what’s most important? 

Technology, process or culture? (Clue: it’s not the first two)

You’ve been appointed the DevOps champion in your organisation: congratulations.  So, what’s the most important issue that you need to address?

It’s the technology – tools and the toolchain – right?  Everybody knows that unless you get the right tools for the job, you’re never going to make things work.  You need integration with your existing stack – though whether you go with tight or loose integration will be an interesting question – a support plan (vendor, 3rd party or internal), and a bug-tracking system to go with your source code management system.  And that’s just the start.

No!  Don’t be ridiculous: it’s clearly the process that’s most important.  If the team doesn’t agree on how stand-ups are run, who participates, the frequency and length of the meetings, and how many people are required for a quorum, then you’ll never be able to institute a consistent, repeatable working pattern.

In fact, although both the technology and the process are important, there’s a third component which is equally important, but typically even harder to get right: culture.  Yup, it’s that touchy-feely thing that we techies tend to struggle with[1].

Culture

I was visiting a medium-sized government institution a few months ago (not in the UK, as it happens), and we arrived a little early to meet the CEO and CTO.  We were ushered into the CEO’s office and waited for a while as the two of them finished participating in the daily stand-up.  They apologised for being a minute or two late, but far from being offended, I was impressed.  Here was an organisation where the culture of participation was clearly infused all the way up to the top.

Not that culture can be imposed from the top – nor can you rely on it percolating up from the bottom[3] – but these two C-level execs were not only modelling the behaviour they expected from the rest of their team, but also seemed, from the brief discussion we had about the process afterwards, to be truly invested in it.  If you can get management to buy into the process – and to be seen to buy in – you are at least less likely to have problems with other groups finding plausible excuses to keep their distance and getting away with it.

So let’s say that management believes that you should give DevOps a go.  Where do you start?

Developers, tick?[5]

Developers may well be your easiest target group.  Developers are often keen to try new things, and to find ways to move things along faster, so they are often the group that can be expected to adopt new technologies and methodologies.  DevOps has arguably been mainly driven by the development community. But you shouldn’t assume that all developers will be keen to embrace this change.  For some, the way things have always been done – your Rick Parfitts of dev, if you will[7] – is fine.  Finding ways to help them work efficiently in the new world is part of your job, not just theirs.  If you have superstar developers who aren’t happy with change, you risk alienating them and losing them if you try to force them into your brave new world.  What’s worse, if they dig their heels in, you risk the adoption of your DevSecOps vision being compromised when they explain to their managers that things aren’t going to change if it makes their lives more difficult and reduces their productivity.

Maybe you’re not going to be able to move all the systems and people to DevOps immediately.  Maybe you’re going to need to choose which apps to start with, and who will be your first DevOps champions.  Maybe it’s time to move slowly.

Not maybe: definitely

No – I lied.  You’re definitely going to need to move slowly.  Trying to change everything at once is a recipe for disaster.

This goes for all elements of the change – which people to choose, which technologies to choose, which applications to choose, which user base to choose, which use cases to choose – bar one.  For all of those elements, if you try to move everything in one go, you will fail.  You’ll fail for a number of reasons.  You’ll fail for reasons I can’t imagine, and, more importantly, for reasons you can’t imagine, but some of the reasons will include:

  • people – most people – don’t like change;
  • technologies don’t like change (you can’t just switch and expect everything to work still);
  • applications don’t like change (things worked before, or at least failed in known ways: you want to change everything in one go?  Well, they’ll all fail in new and exciting[9] ways);
  • users don’t like change;
  • use cases don’t like change.

The one exception

You noticed that, above, I wrote “bar one”, when discussing which elements you shouldn’t choose to change all in one go?  Well done.

What’s that exception?  It’s the initial team.  When you choose your initial application to change, and you’re thinking about choosing the team to make that change, select the members carefully, and select a complete set.  This is important.  If you choose just developers, just test folks, just security folks, just ops folks, or just management, then you won’t actually have proved anything at all.  The same goes if you leave out any one functional group from your list.  Well, you might have proved to a small section of your community that it kind of works, but you’ll have missed out on a trick.  And that trick is that if you choose keen people from across your functional groups, it’s much harder to fail.

Say that your first attempt goes brilliantly.  How are you going to convince other people to replicate your success and adopt DevOps?  Well, the company newsletter, of course.  And that will convince how many people, exactly?  Yes, that number[12].  If, on the other hand, you have team members from across the functional parts of the organisation, then when you succeed, they’ll tell their colleagues, and you’ll get more buy-in next time.

If, conversely, it fails, well, if you’ve chosen your team wisely, and they’re all enthusiastic, and know that “fail often, fail fast” is good, then they’ll be ready to go again.

So you need to choose enthusiasts from across your functional groups.  They can work on the technologies and the process, and once that’s working, it’s the people who will create that cultural change.  You can just sit back and enjoy.  Until the next crisis, of course.


1 – OK, you’re right.  It should be “with which we techies tend to struggle”[2]

2 – you thought I was going to qualify that bit about techies struggling with touchy-feely stuff, didn’t you?  Read it again: I put “tend to”.  That’s the best you’re getting.

3 – is percolating a bottom-up process?  I don’t drink coffee[4], so I wouldn’t know.

4 – do people even use percolators to make coffee anymore?  Feel free to let me know in the comments. I may pretend interest if you’re lucky.

5 – for US readers (and some other countries, maybe?), please substitute “check” for “tick” here[6].

6 – for US techie readers, feel free to perform “s/tick/check/;”.

7 – this is a Status Quo[8] reference for which I’m extremely sorry.

8 – for Millennial readers, please consult your favourite online reference engine or just roll your eyes and move on.

9 – for people who say, “but I love excitement”, try being on call at 2am on a Sunday morning at the end of the quarter when your Chief Financial Officer calls you up to ask why all of last month’s sales figures have been corrupted with the letters “DEADBEEF”[10].

10 – for people not in the know, this is a string often used by techies as test data because a) it’s non-numerical; b) it’s numerical (in hexadecimal); c) it’s easy to search for in debug files and d) it’s funny[11].
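
(Point b is easy to verify for yourself, if you have a Python prompt handy:)

```python
# "DEADBEEF" reads as a word, but is also a perfectly valid hexadecimal number
value = int("DEADBEEF", 16)
print(value)       # 3735928559
print(hex(value))  # 0xdeadbeef
```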

11 – though see [9].

12 – it’s a low number, is all I’m saying.

Using words that normal people understand

Normal people generally just want things to work.

Most people[1] don’t realise quite how much fun security is, or exactly how sexy security expertise makes you to other people[2].  We know that it’s engrossing, engaging and cool; they don’t.  For this reason, when security people go to the other people (let’s just call them “normal people” for the purposes of this article), and tell them that they’re doing something wrong, and that they can’t launch their product, or deploy their application, or that they must stop taking sales orders immediately, and probably for the next couple of days until things are fixed, then those normal people don’t always react with the levels of gratitude that we feel are appropriate.

Sometimes, in fact, they will exhibit negative responses – even quite personal negative responses – to these suggestions.

The problem is this: security folks know how things should be, and that’s secure.  They’ve taken the training, they’ve attended the sessions, they’ve read the articles, they’ve skimmed the heavy books[3], and all of these sources are quite clear: everything must be secure.  And secure generally means “closed” – particularly if the security folks weren’t sufficiently involved in the design, implementation and operations processes.   Normal people, on the other hand, generally just want things to work.  There’s a fundamental disjoint between those two points of view that we’re not going to get fixed until security is the very top requirement for any project from its inception to its ending[4].

Now, normal people aren’t stupid[5].  They know that things can’t always work perfectly: but they would like them to work as well as they can.  This is the gap[7] that we need to cross.  I’ve talked about managed degradation as a concept before, in the post Service degradation: actually a good thing, and this is part of the story.  One of the things that we security people should be ready to do is explain that there are risks to be mitigated.  For security people, those risks should be mitigated by “failing closed”.  It’s easy to stop risk: you just stop system operation, and there’s no risk that it can be misused.  But for many people, there are other risks: an example being that the organisation may in fact go completely out of business because some *[8]* security person turned the ordering system off.  If they’d offered me the choice to balance the risk of stopping taking orders against the risk of losing some internal company data, would I have taken it?  Well yes, I might have.  But if I’m not offered the option, and the risk isn’t explained, then I have no choice.  These are the sorts of words that I’d like to hear if I’m running a business.

It’s not just this type of risk, though.  Coming to a project meeting two weeks before launch and announcing that the project can’t be deployed because the calls against this API aren’t being authenticated is no good at all.  To anybody.  As a developer, though, I have a different vocabulary – and different concerns – to those of the business owner.  How about, instead of saying “you need to use authentication on this API, or you can’t proceed”, the security person asks “what would happen if data that was provided on this API was incorrect, or provided by someone who wanted to disrupt system operation?”  In my experience, most developers are interested – are invested – in the correct operation of the system that they’re running, and of the data that it processes.  Asking questions that show the possible impact of lack of security is much more likely to garner positive reactions than an initial “discussion” that basically amounts to a “no”.

Don’t get me wrong; there are times when we, as security people, need to be firm, and stick to our guns[9], but in the end, it’s the owners – of systems, of organisations, of business units, of resources – who get to make the final decision.  It’s our job to talk to them in words they can understand and to ensure that they are as well informed as we can possibly make them.  Without just saying “no”.


1. by which I mean “those poor unfortunate souls who don’t read these posts, unlike you, dear and intelligent reader”.

2. my wife, sadly, seems to fall into this category.

3. which usually have a picture of a lock on the cover.

4. and good luck with that.

5. while we’ve all met our fair share of stupid normal people, I’m betting that you’ve met your fair share of stupid security people, too, so it balances out[6].

6. probably more than balances out.  Let’s leave it there.

7. chasm.

8. insert your favourite adjectival expletive here.

9. figuratively: I don’t condone bringing any weapons, including firearms, to your place of work.

On passwords and accounts

This isn’t a password problem. It’s a misunderstanding-of-what-accounts-are-for problem.

Once a year or so, one of the big UK tech magazines or websites[1] does a survey where they send a group of people to one of the big London train stations and ask travellers for their password[3].  The deal is that every traveller who gives up their password gets a pencil, a chocolate bar or similar.

I’ve always been sad that I’ve never managed to be at the requisite station for one of these polls.  I would love to get a free pencil – or even better a chocolate bar[4] – for lying about a password.  Or even, frankly, for giving them one of my actual passwords, which would be completely useless to them without some identifying information about me.  Which I obviously wouldn’t give them.  Or again, would pretend to give them, but lie.

The point of this exercise[5] is supposed to be to expose the fact that people are very bad about protecting their passwords.  What it actually identifies is that a good percentage of the British travelling public are either very bad about protecting their passwords, or are entirely capable of making informed (or false) statements in order to get a free pencil or chocolate bar[4]. Good on the British travelling public, say I. 

Now, everybody agrees that passwords are on their way out, as they have been on their way out for a good 15-20 years, so that’s nice.  People misuse them, reuse them, don’t change them often enough, etc., etc..  But it turns out that it’s not the passwords that are the real problem.  This week, more than one British MP admitted – seemingly without any realisation that they were doing  anything wrong – that they share their passwords with their staff, or just leave their machines unlocked so that anyone on their staff can answer their email or perform other actions on their behalf.

This isn’t a password problem.  It’s a misunderstanding-of-what-accounts-are-for problem.

People seem to think that, in a corporate or government setting, the point of passwords is to stop people looking at things they shouldn’t.

That’s wrong.  The point of passwords is to allow different accounts for different people, so that the appropriate people can exercise the appropriate actions, and be audited as having done so.  It is, basically, a matching of authority to responsibility – as I discussed in last week’s post Explained: five misused security words – with a bit of auditing thrown in.
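
As a toy illustration of that authority-responsibility matching (all the account names and permissions below are made up for the example), separate accounts mean every action can be attributed and audited – which is exactly what sharing a password destroys:

```python
from datetime import datetime, timezone

# Hypothetical accounts: an MP and a staffer with narrower, delegated authority
PERMISSIONS = {
    "mp_jane": {"read_email", "send_email"},
    "staffer_sam": {"read_email"},
}

audit_log = []

def perform(user, action):
    """Check the user's authority for the action, and audit the attempt either way."""
    allowed = action in PERMISSIONS.get(user, set())
    audit_log.append((datetime.now(timezone.utc), user, action, allowed))
    return allowed

perform("staffer_sam", "read_email")   # True: within delegated authority
perform("staffer_sam", "send_email")   # False: refused, and the attempt is logged
```

If the staffer simply logs in as the MP instead, both rows of that audit log read “mp_jane”, and nobody can ever establish who actually did what.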

Now, looking at things you shouldn’t is one action that a person may have responsibility for, certainly, but it’s not the main thing.  But if you misuse accounts in the way that has been exposed in the UK parliament, then worse things are going to happen.  If you willingly bypass accounts, you are removing the ability of those who have a responsibility to ensure correct responsibility-authority pairings to track and audit actions.  You are, in fact, setting yourself up with excuses before the fact, but also making it very difficult to prove wrongdoing by other people who may misuse an account.  A culture that allows such behaviour is one which doesn’t allow misuse to be tracked.  This is bad enough in a company or a school – but in our seat of government?  Unacceptable.  You just have to hope that there are free pencils.  Or chocolate bars[4].


1. I can’t remember which, and I’m not going to do them the service of referencing them, or even looking them up, for reasons that should be clear once you read the main text.[2]

2. I’m trialling a new form of footnote referencing. Please let me know whether you like it.

3. I guess their email password, but again, I can’t remember and I’m not going to look it up.

4. Or similar.

5. I say “point”…

What’s the point of security? 

Like it or not, your users are part of your system.

No, really: what’s the point of it? I’m currently at the OpenStack Summit in Sydney, and was just thrown out of the exhibition area as it’s closed for preparations for an event. A fairly large gentleman in a dark-coloured uniform made it clear that if I didn’t have a particular sticker on my badge, then I wasn’t allowed to stay.  Now, as Red Hat* is an exhibitor, and I’m our booth manager’s buddy**, she immediately offered me the relevant sticker, but as I needed a sit-down and a cup of tea***, I made my way to the exit, feeling only slightly disgruntled.

“What’s the point of this story?” you’re probably asking. Well, the point is this: I’m 100% certain that if I’d asked the security guard why he was throwing me out, he wouldn’t have had a good reason. Oh, he’d probably have given me a reason, and it might have seemed good to him, but I’m really pretty sure that it wouldn’t have satisfied me. And I don’t mean the “slightly annoyed, jet-lagged me who doesn’t want to move”, but the rational, security-aware me who can hopefully reason sensibly about such things. There may even have been such a reason, but I doubt that he was privy to it.

My point, then, really comes down to this. We need to be able to explain the reasons for security policies in ways which:

  1. Can be expressed by those enforcing them;
  2. Can be understood by uninformed users;
  3. Can be understood and queried by informed users.

In the first point, I don’t even necessarily mean people: sometimes – often – it’s systems that are doing the enforcing. This makes things even more difficult in some ways: you need to think about UX***** in ways that are appropriate for two very, very different constituencies.

I’ve written before about how frustrating it can be when you can’t engage with the people who have come up with a security policy, or implemented a particular control. If there’s no explanation of why a control is in place – the policy behind it – and it’s an impediment to the user******, then it’s highly likely that people will either work around it, or stop using the system.

Unless. Unless you can prove that the security control is relevant, proportionate and beneficial to the user. This is going to be difficult in many cases.  But I have seen some good examples, which means that I’m hopeful that we can generally do it if we try hard enough.  The problem is that most designers of systems don’t bother trying. They don’t bother to have any description, usually, but as for making that description show the relevance, proportionality and benefits to the user? Rare, very rare.

This is another argument for security folks being more embedded in systems design because, like it or not, your users are part of your system, and they need to be included in the design.  Which means that you need to educate your UX people, too, because if you can’t convince them of the need to do security, you’re sure as heck not going to be able to convince your users.

And when you work with them on the personae and use case modelling, make sure that you include users who are like you: knowledgeable security experts who want to know why they should submit themselves to the controls that you’re including in the system.

Now, they’ve opened up the exhibitor hall again: I’ve got to go and grab myself a drink before they start kicking people out again.


*my employer.

**she’s American.

***I’m British****.

****and the conference is in Australia, which meant that I was actually able to get a decent cup of the same.

*****this, I believe, stands for “User eXperience”. I find it almost unbearably ironic that the people tasked with communicating clearly to us can’t fully grasp the idea of initial letters standing for something.

******and most security controls are.

Next generation … people

… security as a topic is one which is interesting, fast-moving and undeniably sexy…

DISCLAIMER/STATEMENT OF IGNORANCE: a number of regular readers have asked why I insist on using asterisks for footnotes, and whether I could move to actual links, instead.  The official reason I give for sticking with asterisks is that I think it’s a bit quirky and I like that, but the real reason is that I don’t know how to add internal links in WordPress, and can’t be bothered to find out.  Apologies.

I don’t know if you’ve noticed, but pretty much everything out there is “next generation”.  Or, if you’re really lucky “Next gen”.  What I’d like to talk about this week, however, is the actual next generation – that’s people.  IT people.  IT security people.  I was enormously chuffed* to be referred to on an IRC channel a couple of months ago as a “greybeard”***, suggesting, I suppose, that I’m an established expert in the field.  Or maybe just that I’m an old fuddy-duddy***** who ought to be put out to pasture.  Either way, it was nice to come across young(er) folks with an interest in IT security******.

So, you, dear reader, and I, your beloved protagonist, both know that security as a topic is one which is interesting, fast-moving and undeniably******** sexy – as are all its proponents.  However, it seems that this news has not yet spread as widely as we would like – there is a worldwide shortage of IT security professionals, as a quick check on your search engine of choice for “shortage of it security professionals” will tell you.

Last week, I attended the Open Source Summit and Linux Security Summit in LA, and one of the keynotes, as it always seems to be, was Jim Zemlin (head of the Linux Foundation) chatting to Linus Torvalds (inventor of, oh, I don’t know).  Linus doesn’t have an entirely positive track record in talking about security, so it was interesting that Jim specifically asked him about it.  Part of Linus’ reply was “We need to try to get as many of those smart people before they go to the dark side [sic: I took this from an article by the Register, and they didn’t bother to capitalise.  I mean: really?] and improve security that way by having a lot of developers.”  Apart from the fact that anyone who references Star Wars in front of a bunch of geeks is onto a winner, Linus had a pretty much captive audience just by nature of who he is, but even given that, this got a positive reaction.  And he’s right: we do need to make sure that we catch these smart people early, and get them working on our side.

Later that week, at the Linux Security Summit, one of the speakers asked for a show of hands to find out the number of first-time attendees.  I was astonished to note that maybe half of the people there had not come before.  And heartened.  I was also pleased to note that a good number of them appeared fairly young*********.  On the other hand, the number of women and other under-represented demographics seemed worse than in the main Open Source Summit, which was a pity – as I’ve argued in previous posts, I think that diversity is vital for our industry.

This post is wobbling to an end without any great insights, so let me try to come up with a couple which are, if not great, then at least slightly insightful:

  1. we’ve got a job to do.  The industry needs more young (and diverse) talent: if you’re in the biz, then go out, be enthusiastic, show what fun it can be.
  2. if showing people how much fun security can be isn’t working, encourage them to do a search for “IT security median salaries comparison”.  It’s amazing how a pay cheque********** can motivate.

*note to non-British readers: this means “flattered”**.

**but with an extra helping of smugness.

***they may have written “graybeard”, but I translate****.

****or even “gr4yb34rd”: it was one of those sorts of IRC channels.

*****if I translate each of these, we’ll be here for ever.  Look it up.

******I managed to convince myself******* that their interest was entirely benign though, as I mentioned above, it was one of those sorts of IRC channels.

*******the glass of whisky may have helped.

********well, maybe a bit deniably.

*********to me, at least.  Which, if you listen to my kids, isn’t that hard.

**********who actually gets paid by cheque (or check) any more?

Diversity – redux

One of the recurring arguments against affirmative action from majority-represented groups is that it’s unfair that the under-represented group has comparatively special treatment.

Fair warning: this is not really a blog post about IT security, but about issues which pertain to our industry.  You’ll find social sciences and humanities – “soft sciences” – referenced.  I make no excuses (and I should declare previous form*).

Warning two: many of the examples I’m going to be citing are to do with gender discrimination and imbalances.  These are areas that I know the most about, but I’m very aware of other areas of privilege and discrimination, and I’d specifically call out LGBTQ, ethnic minority, age, disability and non-neurotypical discrimination.  I’m very happy to hear (privately or in comments) from people with expertise in other areas.

You’ve probably read the leaked internal document (a “manifesto”) from a Google staffer challenging affirmative action intended to address diversity, and complaining about a liberal/left-leaning monoculture at the company.  If you haven’t, you should: take the time now.  It’s well-written, with some interesting points, but I have some major problems with it that I think are worth addressing.  (There’s a very good rebuttal of certain aspects available from an ex-Google staffer.)  If you’re interested in where I’m coming from on this issue, please feel free to read my earlier post: Diversity in IT security: not just a canine issue**.

There are two issues that concern me specifically:

  1. no obvious attempt to acknowledge the existence of privilege and power imbalances;
  2. the attempt to advance the gender essentialism argument by alleging an overly leftist bias in the social sciences.

I’m not sure that these approaches are intentional or unconscious, but they’re both insidious, and if ignored, allow more weight to be given to the broader arguments put forward than I believe they merit.  I’m not planning to address those broader issues: there are other people doing a good job of that (see the rebuttal I referenced above, for instance).

Before I go any further, I’d like to record that I know very little about Google, its employment practices or its corporate culture: pretty much everything I know has been gleaned from what I’ve read online***.  I’m not, therefore, going to try to condone or condemn any particular practices.  It may well be that some of the criticisms levelled in the article/letter are entirely fair: I just don’t know.  What I’m interested in doing here is addressing those areas which seem to me not to be entirely open or fair.

Privilege and power imbalances

One of the recurring arguments against affirmative action from majority-represented groups is that it’s unfair that the under-represented group has comparatively special treatment.  “Why is there no march for heterosexual pride?”  “Why are there no men-only colleges in the UK?”  The generally accepted argument is that until there is equality in the particular sphere in which a group is campaigning, then the power imbalance and privilege afforded to the majority-represented group means that there may be a need for action to help members of the under-represented group achieve parity.  That doesn’t mean that members of that group are necessarily unable to reach positions of power and influence within that sphere, just that, on average, the effort required will be greater than that for those in the majority-privileged group.

What does all of the above mean for women in tech, for example?  That it’s generally harder for women to succeed than it is for men.  Not always.  But on average.  So if we want to make it easier for women (in this example) to succeed in tech, we need to find ways to help.

The author of the Google piece doesn’t really address this issue.  He (and I’m just assuming it’s a man who wrote it) suggests that women (who seem to be the key demographic with whom he’s concerned) don’t need to be better represented in all parts of Google, and therefore affirmative action is inappropriate.  I’d say that even if the first part of that thesis is true (and I’m not sure it is: see below), then affirmative action may still be required for those who do.

The impact of “leftist bias”

Many of the arguments presented in the manifesto are predicated on the following thesis:

  • the corporate culture at Google**** is generally leftist-leaning
  • many social sciences are heavily populated by leftist-leaning theorists
  • these social scientists don’t accept the theory of gender essentialism (that women and men are suited to different roles)
  • THEREFORE corporate culture is overly inclined to reject gender essentialism
  • HENCE if a truly diverse culture is to be encouraged within corporate culture, leftist theories such as gender essentialism should be rejected.

There are several flaws here, one of which is the implication that diversity requires accepting views which are anti-diverse.  It’s a reflection of a similar right-leaning fallacy: that in order to show true tolerance, the views of intolerant people should be afforded the same privilege as those of people who are aiming for greater tolerance.*****

Another flaw is the argument that, just because a set of theories is espoused by a political movement to which one doesn’t subscribe, it is therefore suspect.

Conclusion

As I’ve noted above, I’m far from happy with much of the so-called manifesto from what I’m assuming is a male Google staffer.  This post hasn’t been an attempt to address all of the arguments, but to attack a couple of the underlying arguments, without which I believe the general thread of the document is extremely weak.  As always, I welcome responses either in comments or privately.



*my degree is in English Literature and Theology.  Yeah, I know.

**it’s the only post on which I’ve had some pretty negative comments, which appeared on the reddit board from which I linked it.

***and is probably therefore just as far off the mark as anything else that you or I read online.

****and many other tech firms, I’d suggest.

*****an appeal is sometimes made to the left’s perceived poster child of postmodernism: “but you say that all views are equally valid”.  That’s not what postmodern (deconstructionist, post-structuralist) theory actually says.  I’d characterise it more as:

  • all views are worthy of consideration;
  • BUT we should treat with suspicion those views held by those with privilege, or which privilege those with power.