Why Blissfully decided to go all in on serverless
Serverless has become a big buzzword of late, and with good reason. It has the potential to completely alter how developers write code. They can simply write a series of event triggers and let the cloud vendor worry about providing whatever compute resources are required to complete the job. It represents a huge shift in how programs are developed, but because the approach is fairly new, it has been difficult to find companies that were built from the ground up on it.
Blissfully, a startup that helps customers manage their Software as a Service usage inside their companies, is one company that decided to do just that. Aaron White, co-founder and CTO, says that when he was building early versions of Blissfully, he found he needed quick bursts of compute power to deliver a list of all the SaaS products an organization is using.
He figured he could set aside a bunch of servers to provide that burst of power as needed, but that would have required a ton of overhead on his part to manage. At this point, he was a lone programmer trying to prove his SaaS management idea was even possible. As he looked at the pros and cons of serverless versus traditional virtual machines, he began to see serverless as a viable approach.
What he learned along the way was that serverless offers many advantages to a company with a bursty workload like Blissfully's, scaling up and down as needed. But it isn't perfect: there are issues around management, tooling and the trade-offs of that scaling ability that he had to work through on the fly, especially coming to the approach as early as he did.
Serverless makes sense
Blissfully is a service where serverless made a lot of sense. It wouldn’t have to manage or pay for servers it wasn’t using. Nor would it have to worry about the underlying infrastructure at all. That would be up to the cloud provider, and it would only pay for the bursts as they happened.
Serverless is actually a misnomer, in that it doesn't mean there are no servers. It means that you don't have to set up servers in order to run your program, which is a pretty mind-blowing transformation. In traditional programming, you have to write your code and set up all the underlying hardware ahead of time, whether it's in your data center or in the cloud. With serverless, you just write the code and the cloud provider handles all of that for you.
The way it works in practice is that programmers set up a series of event triggers, so when a certain thing happens, the cloud provider sees it and provides the necessary resources on demand. Most of the major cloud vendors offer this type of service, whether it's AWS Lambda, Azure Functions or Google Cloud Functions.
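To make that concrete, here is a minimal sketch of what an event-triggered function might look like in TypeScript, written in the AWS Lambda style. The event shape and the discoverSaaSApps helper are illustrative assumptions, not Blissfully's actual code.

```typescript
// Minimal sketch of an event-triggered serverless function (AWS Lambda style).
// The event shape and discoverSaaSApps helper are hypothetical, for illustration only.
interface ScanRequest {
  organizationId: string;
}

// Placeholder for whatever discovery work the real function would do,
// e.g. querying billing, SSO or email data for SaaS products in use.
async function discoverSaaSApps(organizationId: string): Promise<string[]> {
  return [];
}

// The cloud provider invokes this handler only when the trigger fires,
// allocating compute for the duration of the call and nothing more.
export const handler = async (event: ScanRequest) => {
  const apps = await discoverSaaSApps(event.organizationId);
  return {
    statusCode: 200,
    body: JSON.stringify({ count: apps.length, apps }),
  };
};
```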
At this point, White began to think about serverless as a way of freeing him from thinking about managing and maintaining infrastructure and all that entailed. "I started thinking, let's see how far we can take this. Can we really do absolutely everything serverless, and if so, that reduces a ton of traditional DevOps-style work you have to do in practice. There's still plenty, but that was the thinking at the beginning," he said.
Overcoming obstacles
But there were issues, especially getting into serverless as early as he did. For starters, White needed to find developers who could work this way, and in 2016, when the company launched, there weren't many people out there with serverless skills. White said he wasn't looking for direct experience so much as people who were curious to learn and flexible enough to deal with new technology, however Blissfully ended up implementing it.
Once he figured out the basics, he needed to think about how this would work structurally. "Part of the challenge is figuring out where do you draw the boundaries between different serverless functions? How do you think about how much you want to overload the capability of one function versus another? How do you want to split it up? You could go way too specific, and you can, of course, go way too broad. So there's a lot of judgment calls to be made in terms of how you want to split your code base to work in this way," he said.
The other challenge of going serverless so early was a dearth of tooling around it. White found Serverless, Inc. right away, which helped him with a basic framework for developing, but he lacked good logging tools and says the company still struggles with this even now. "DevOps doesn't go away. This is still running on a server somewhere (even if you don't control that) and you will run into issues." One such issue is what he calls a "cold start issue."
Getting the resources right
Blissfully uses AWS Lambda, and when its customers require resources, it isn't as though Amazon has dedicated capacity set aside waiting for the request. If Lambda has to start a function cold, that can introduce latency. To compensate, Blissfully runs a job that pings Lambda continually, so it's always warm and ready to run the actual application without the lag of starting from scratch.
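A common way to implement that kind of warm-up is a scheduled invocation that the function recognizes and returns from immediately. The sketch below assumes a periodic scheduler (such as an EventBridge rule) and a hypothetical warmup flag; it is not Blissfully's actual scheme.

```typescript
// Hypothetical warm-up handling: a scheduled rule invokes the function every
// few minutes with a marker event so it returns immediately, keeping an
// execution environment warm without doing real work. The "warmup" flag is an assumption.
interface WarmableEvent {
  warmup?: boolean;
  organizationId?: string;
}

export const handler = async (event: WarmableEvent) => {
  if (event.warmup) {
    // Skip the real work on warm-up pings; the point is just to keep the
    // container alive so user-facing requests avoid a cold start.
    return { statusCode: 200, body: "warm" };
  }
  // ...normal request handling would go here...
  return { statusCode: 200, body: "ok" };
};
```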
The opposite can also be a problem: you can scale much faster than you're ready to deal with, which is tough for a small team. In that case, White says, you want to put a limiter on the rate of calls so you don't end up spending more than you can afford and the system doesn't scale beyond your team's ability to manage it. "I think, in some ways, this actually accelerates you running into problems where you would normally be at larger scale before you really had to think about them," White said.
The other piece is that once Lambda gets everything going, it can move data faster than your external APIs can handle, which can require limiters to actually slow things down. "I never had that problem in the past, where I was provisioning so many computational resources that Google was yelling at me for being too fast. Being too fast for Google takes a lot of effort, but it doesn't take a lot of effort with Lambda. When it does decide to spool up whatever resources, you can do some serious outbound damage to other APIs." That meant he and his team had to think very early on about building sophisticated rate-limiting schemes.
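For illustration, here is a rough sketch of the kind of outbound rate limiting he's describing; the interval and the wrapped API call are assumptions, not Blissfully's implementation. Note that a limiter like this only throttles calls within a single function instance, so in practice it would be paired with a cap on how many instances can run at once.

```typescript
// Rough sketch of an outbound rate limiter (not Blissfully's implementation):
// space calls to an external API so a burst of function invocations can't
// exceed a fixed requests-per-second budget within this process.
class RateLimiter {
  private nextSlot = 0;

  constructor(private readonly minIntervalMs: number) {}

  // Resolves only when the caller is allowed to make its next call.
  async acquire(): Promise<void> {
    const now = Date.now();
    const wait = Math.max(0, this.nextSlot - now);
    this.nextSlot = Math.max(now, this.nextSlot) + this.minIntervalMs;
    if (wait > 0) {
      await new Promise((resolve) => setTimeout(resolve, wait));
    }
  }
}

// Example: cap outbound calls at roughly 10 per second.
const limiter = new RateLimiter(100);

// Wrap any call to an external API so it waits for the limiter first.
async function callExternalApi<T>(fn: () => Promise<T>): Promise<T> {
  await limiter.acquire();
  return fn();
}
```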
As for costs, White estimates that his costs are much lower now that he has the service built and in place. “Our costs are so low right now, and far lower than if we had server-based infrastructure. Our computational pattern is very bursty.” That’s because it re-parses the SaaS database once a day or when the customer first signs up, and in between, usage is fairly low beyond interacting with the data.
“So for us that was perfect for serverless because I don’t really need to keep capacity around that would be pure waste.”