This week I attended the O'Reilly Open Source Conference 2017. This was my third OSCON, and probably the most interesting yet.
OSCON 2002 was my very first tech conference. I didn't know what to expect, nor how to get the most out of the event.
To be completely candid, the main reason I went was to see the keynote presentations. I was most interested in the Weta Workshop presentation, but the opportunities to see RMS and Lawrence Lessig were big draws, too.
I didn't know anyone personally, though I was familiar with the names of a few open source luminaries who were there. I knew of ESR, cmdrtaco and hemos of Slashdot fame, and a few other folks, but that was about it.
It was an intimidating and overwhelming experience, and I didn't take as much away from it as I likely could have.
Four years later I was back for OSCON 2006. This time I knew Rich Bowen, having met him at the Ohio LinuxFest where he was a speaker, so I at least had someone to eat meals with. I also had a slightly better idea of how to navigate a tech conference.
It was at this event that Google announced their Google Code project for providing Subversion hosting for open source projects. This was in direct competition with SourceForge, and was kind of a big deal at the time. Google has since exited the code hosting business, leaving GitHub and its brethren to take on that role.
Having reflected on the last two OSCONs I attended, the numerous conferences I've attended since, and my personal and professional growth in the intervening years, I felt it best to record some thoughts on OSCON 2017 now, while they're still fresh.
Two different sessions touched on different aspects of the same thing, both of which struck a chord with me. "Application Security: From Zero to Hero" was a high-level and largely non-technical examination of the speaker's efforts to front-load security and compliance into the application development process. This is, of course, an obvious thing to do, and yet many developers and organizations don't do it (very well) for a variety of reasons.
At one point, the speaker had a slide of Gandalf from the Lord of the Rings films, where he's telling the Balrog "You shall not pass!" This, according to the speaker, is the typical attitude of security and operations teams toward developers trying to push code to production. It was funny, but it also rang true for many in the audience.
The speaker observed that, with very few exceptions, no developer was actively trying to harm the organization by deploying poor code. And yet the things about which developers, ops, and security care are not all the same, so some teams feel the need to slam on the brakes. By moving as much of this to the beginning of the process, through early static code analysis, software composition analysis, and the use of related tools, the entire IT focus can shift from "good deployments" to "good citizens". Everyone has skin in the game, and it's unlikely (though, sadly, not impossible) that any one person is actively working to cause trouble, so we should be working to elevate one another proactively and collaboratively.
The very next session I attended was "The Paved Road at Netflix: At the junction of freedom and responsibility", by Dianne Marsh, the director of the Netflix tools engineering team. Her team is responsible for supporting those things that are commonly used throughout Netflix, but their team is not in a position to enforce the use of said things. Development teams are free to make their own choices, but they have a responsibility to strive to do what's best for the organization.
It was an interesting talk, because my current team at my current employer is in a very similar situation. Ideally, we build things that other teams can use to be more productive with their own objectives, but we can't force anyone to use the stuff we build. This has caused a variety of friction and tension, as we try to provide value while respecting differing needs.
At Netflix, Marsh's team sets very clear agreements with her internal customers regarding what they will and will not support. If a team wishes to explore a technology or tool that is not currently supported, the team is empowered to do so. But they may be on their own while they explore, and depending on the overall utility of the solution and the other priorities already in flight on Marsh's team, that shiny new project may never get adopted anywhere else.
The analogy, from the title, was all about roads. Common tools and tech are like an interstate highway: well maintained and with regular rest stops. You should expect no problems using these things. Less common tools are still available for your use, but you won't get quite the same level of support: consider these state roads. There will be gas stations, but there may be potholes. If you venture off onto the county roads, don't expect as many gas stations, and be prepared to find your own way.
This analogy engenders a very interesting attitude within Marsh's team. They're not in a position to tell any other team "No, you can't do that." But they can absolutely say "We can't help you with that." This is not an entirely negative response, though. Marsh says that she wants other teams to have "informed adventures" rather than "accidental detours". If another team wants to explore some new thing that they think will help them, Marsh's team sets the stage that they're on their own for the design and support of that thing. If it proves successful and sustainable, that team is then expected to work collaboratively with Marsh's team to add it to the list of supported technologies.
The adventurer charts the course, she said, and the cost to them is to figure out how to get back to the pavement. Marsh's team has an obligation to help these adventurers bring back what they've learned.
This entire paradigm is empowering, and invigorating, and honestly fascinating to me. It's worth exploring, but a couple of realities dampen the awesomeness. First, Netflix's business is streaming videos, so the risk associated with experiments is relatively low. Netflix's customers don't have Service Level Agreements with financial penalties for violations. In a related vein, Netflix is not in a regulated industry like health care or finance. When I asked Marsh if she thought the Paved Road model might work in such environments, she said no. Finally, Netflix only hires senior developers who have had some level of experience "in the world". The maturity they bring is a large part of the reason they can be empowered to make the decisions they do. Marsh said she would not feel comfortable granting these opportunities to a fresh-from-college developer.
Both the security and Netflix sessions had some interesting thematic overlaps. No one sets out to do something stupid. We need to find ways to empower individuals and teams to produce the most value for the organization. As one of the guys on the hook for supporting the things that developers create, I admit that trusting and empowering don't come naturally to me. I've seen a lot of smart and well-intentioned people make gallingly short-sighted and unsustainable decisions. The security domain is sufficiently complex and nuanced that everyone is likely to have a different opinion of who is responsible for what. Devs are likely to ask "Can't we just use the firewall?" or "Can't we just filter that at the load balancer?" Operations are likely to ask "Can't you store this in the database? Why do you need Redis?"
These two sessions alone have given me a lot to think about, and have made OSCON 2017 worth attending. The Kubernetes training I took was icing on the cake!
At lunch today, I sat with a few strangers in order to meet some new people. While chatting about the conference, one of them observed, "When you're at an IBM or a Red Hat conference, you're at an IBM or Red Hat conference, you know? But here, there's so much different stuff, from so many different people." OSCON can feel a little diffuse, without a single unifying vendor or theme, but I actually think that's a good thing. It lets attendees select the sessions that best align with their interests and take away a broader perspective on technology.
I wish I had known some of this back in 2002, so that I could have gotten much more out of my first tech conference.