James Snell has been with NearForm since March 2017 and is based in Central California, near Fresno. He works closely with Conor O’Neill to lead NearForm Research. Here we caught up with him on everything from Open Source projects to California wildfires and he also gives us a quick intro to the current work on HTTP/3 support for Node.js and the performance improvements it will deliver.
Firstly, can you tell me what you do?
I’m Head of NearForm Research, which was recently established to give focus to and formalise what we as NearForm have always done – research, build and contribute to open source projects. I’m also on the Node.js Technical Steering Committee, the body that oversees the Node.js project within the OpenJS Foundation.
What sort of things do you do, as Head of Research?
I play an active part in coordinating NearForm contributions to the open source projects we use, whether that’s React, Node.js, or the modules that build on them. I head up the technical team, but I also handle some consulting work with clients from time to time. It’s great to keep that experience of working on live projects and an understanding of enterprise environments.
I enjoy my role at NearForm Research because I get to focus on the technology itself. Conor is the one who has to give the most consideration to the commercial benefits. We will both be engaging with the wider ecosystem to understand the drivers and the application of technologies across industries. And to collaborate on projects. We’re already working with FINOS in the financial services space to do just that.
What’s one example of a project you’re working on?
Our project to build QUIC and HTTP/3 support for Node.js. It’s something the entire Node.js ecosystem can benefit from. We are also contributing other features to Node.js – like Worker threads – with the aim of making significant improvements to how Node.js itself runs and how it’s structured internally.
Can you describe what impact HTTP/3 will have? What was wrong with the first two versions of HTTP?
HTTP/1 has been around for more than 20 years, so it’s legitimate to ask why we need something new – it’s perfectly fine for the majority of cases.
The issue is head-of-line blocking. HTTP/1 can’t multiplex – that is, send multiple requests and responses over one connection at once – so if you’re loading a large image file and a small text file, the text has to wait for the image.
HTTP/2 enabled them both to be sent at the same time, but it pushed the problem lower down the stack. Both versions run over TCP, which delivers strictly ordered packets – if one packet is lost, everything else has to wait until that packet is resent.
With HTTP/1, a lost packet only blocks the one connection it belongs to, but with HTTP/2, a lost packet stalls all of the streams – 20 of them, say – multiplexed over that single connection, so in a number of cases you end up with worse performance on long-lived connections.
HTTP/3 takes a completely different approach, using UDP instead of TCP at the lower level. UDP sends each packet independently – there is no ordering dependency between packets at all. That’s the characteristic we want for HTTP/3: it allows multiplexing while eliminating the head-of-line blocking issue. On the other hand, it consumes more memory and processor power, so there are definite trade-offs to be made.
When you build a site do you use a specific version of HTTP?
Developers really shouldn’t have to care. You can write your app to support HTTP/2 in Node.js, but for a variety of reasons it’s probably not going to get used – it depends on the middleboxes at the proxy layer, and all of that is transparent to the user.
HTTP/3 is different because the proxy and middlebox vendors have committed to supporting it on both the front end and the back end, so developers should actually be able to make use of it, and we should be able to get clear metrics on it.
When will developers start using it?
Hopefully near the beginning or middle of 2020. The IETF is still writing the specs for HTTP/3, and none of the implementations will be ready for production use until that finishes, but there are plenty being worked on. Cloudflare has really been leading much of the effort, and we’ll be working on interop testing with their implementation soon!
Where did you work before NearForm?
I was at IBM for 16 years doing a variety of things all involving open standards, open source, and research.
What made you join NearForm?
I respected the way NearForm was a part of the open source community – contributing to it, but without trying to dominate it. NearForm takes a humble approach that focuses on what the community needs more so than what NearForm can get out of it.
What’s good about working at NearForm?
The people are phenomenal in terms of skill set. Every employer tends to have people you just don’t want to work with. I haven’t experienced that yet at NearForm.
Are there many others on the West Coast?
We have one person in Vancouver, two in the Bay Area and more being added all the time. I travel over to Ireland around five to six times per year — often enough that the passport control folks are starting to recognize me as I come through.
Did you model NearForm Research on anything you’d seen other companies do?
No, we specifically didn’t do that. Most companies are focused on developing proprietary IP for the company that can be leveraged and sold: ‘Let’s develop a new capability that we can make money from.’
NearForm Research is about contributing to the entire ecosystem. We are passionate about giving back to the community and helping with the sustainability of open source. It also means we can continue to build on our deep expertise for the benefit of our clients: ‘Hey, we helped develop this and we’re the best people to talk to about this.’
What’s your biggest challenge?
Open source sustainability. Open source code is developed, supported and maintained by individuals in their own time, often outside their day jobs. They aren’t compensated for building it, yet companies like Google, Amazon, and even NearForm derive a significant amount of value from it. Across the board it’s saving companies money, but very little of that value makes its way back to the maintainers – the people giving up their weekends to fix bugs.
Then we see examples of these maintainers burning out – so we need to find more creative ways of supporting them.
Open Source sustainability – it’s a matter of finding the right formula?
I think we’re the people to solve this. NearForm has an amazing reputation for supporting the ecosystem, rather than just consuming it.
But how do you determine the value of Open Source?
I think there is an answer, but it requires a lot more thought. Just one of our developers (now also Technical Director), Matteo Collina, has 500+ modules published. One of those is downloaded 10,000 times a month, but it’s impossible to know how much it’s actually used, or how critical it is to a project.
One example that I like to use is this: Imagine a project that’s been worked on for a year by a team of 100 developers, all of whom work very hard and reach all their goals and milestones but the finished project is only used by a small handful of customers. Contrast that with a single developer who spends a single weekend writing up a small open source library that ends up being used by millions. How do you measure the value of those projects, and the value delivered by their developers? How do you compare them? It’s a difficult problem that has no easy solution.
How do you think companies who use Open Source should be giving back?
This is a complex question because there are many ways they can, but it really comes down to ‘contribute!’ – Contribute code, contribute documentation, help respond to questions from users, help test and identify bugs. Compensate your developers for the time they are working on open source projects that bring value to the organization. Provide incentives for getting involved, and provide training on open source. Most importantly, find ways of financially supporting the open source developers you depend on the most.
What is the Clinic.js tool that you are also working on?
Whenever a customer said, “We’re having a performance issue, please help us fix it,” we’d send Matteo or me to spend a few days looking at the code and then explain what was wrong with it.
We wanted to automate this process. We called it “Matteo-in-a-Box” for a few weeks.
Clinic Doctor does an initial performance analysis. It runs the software, collects metrics, analyses them, looks for common faults, and gives specific recommendations. Clinic Flame and Clinic Bubbleprof give much more detailed information, in specific areas.
For a while, we considered selling these tools as a product line, but there’s more value in them being free, open source tools that anyone can use.
Let’s talk about yourself, so people can get to know you. Where did you get your passion for building software?
Really that’s a difficult question for me. I’ve been writing software in some form or another since I was 8 years old. It was just a hobby that I really enjoyed until I figured out that I could actually make money doing it! From there I just discovered a passion for problem solving. I see something that I think I can add value to, something that I think really should be done and I do it. What I enjoy more than anything else is instigating others to pick things up and run with them — like when I work on some new feature of Node.js and suddenly there are three or four new contributors to the project who pick that up and do even more with it.
Do you have to worry about the wildfires in California?
No, they were further south and further north, but we do get all the bad air from the rest of the state. We’ve had drought in the past. It was so bad that the ground level went down by three feet in some places. The general lack of rain and a lot more fires are the two big challenges in this state – but lately it has been raining a lot more. Having lived in the Central Valley for my whole life, and having really paid attention to what is happening, it is clear that Climate Change is having a dramatic impact.
What do you do in your free time?
My way of dealing with work stress is to go build something that requires no electronic parts! For example, my old house ended up with two additional rooms thanks to the work I was doing on the Node.js project. We moved onto 3.2 acres of land a year ago so there are lots of new projects around the house. I’m rebuilding a workshop, I have a lot of fences to put up, and landscaping to do. I try to get away from the laptop as much as I can.
Thank you James for taking the time out to give us a glimpse into what NearForm Research is about and for sharing your views on how enterprises can play their part in driving open source sustainability – as well as benefiting from the many revolutionary projects you are working on!