API Development
I'm pretty sure we all witnessed, if not lived through the disruption of, one very expensive bad code release and IT outage this summer. As someone who spent decades in software development, I can tell you with confidence that although the actual costs of that mistake ran upwards of one billion dollars, the emotional toll on the developers involved was probably greater. It was a lose-lose situation for everyone. I know developers are pushed harder than ever to code faster, release more often, and, "oh, by the way," test their code. Meanwhile, infrastructures get more complex by the minute as regions, offices, networks, apps, services, machines, and containers multiply.
As a former developer, I understand the pain of building, testing, and deploying your own code (which, by the way, is even more complicated than ever in today's cloud-native, microservices-driven world). Now, as a CEO, I also understand the pain (or at least the very real fear of the pain) of a potential billion-dollar mistake. Regardless of whether you relate to either of those positions, on a less sensational note, I think we can all appreciate the role of healthy growth in a business's success today and the critical part that software plays.
Let's take APIs, for example. They're a subset of everything developers might be building, but, importantly, an increasingly vital one, and one of the biggest drivers of volume, urgency, and security concerns. Application Programming Interfaces: it's an awful acronym, I'm sorry. Despite that, it's a heck of a powerful thing. These little pieces of code are the language that enables all of our devices and applications to connect and share information. I'm certainly not the first to say it, but APIs are the essential building blocks of the modern world.
July 29, 2024 | 9 min read
API Gateway
When building applications with APIs, choosing the right architecture for the job is key.
APIs can be defined by SOAP, GraphQL, gRPC - the list goes on and on. In fact, any interface between two pieces of code is an API. After all, APIs are application programming interfaces.
Here, we'll examine why RESTful APIs are often the first and best choice: they're nice, neat GET and POST endpoints with developer-friendly URLs. REST still holds about 90% of the market (our friends at Postman track those stats in a great annual report). But there are also times when you may want to consider a different API protocol, like gRPC, for your application.
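To make that concrete, here is a minimal sketch of the kind of neat GET endpoint REST gives you, using only Go's standard library (Go 1.22+ routing patterns); the /orders/{id} route and Order type are illustrative assumptions, not taken from any particular product.

```go
// A minimal REST-style endpoint sketch. The route and Order fields are
// illustrative assumptions, not taken from the article.
package main

import (
	"encoding/json"
	"net/http"
)

type Order struct {
	ID     string  `json:"id"`
	Status string  `json:"status"`
	Total  float64 `json:"total"`
}

func main() {
	// Go 1.22+ supports method-qualified patterns with path parameters.
	http.HandleFunc("GET /orders/{id}", func(w http.ResponseWriter, r *http.Request) {
		order := Order{ID: r.PathValue("id"), Status: "shipped", Total: 42.50}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(order)
	})

	http.ListenAndServe(":8080", nil)
}
```

The same operation in gRPC would be defined as an RPC method in a .proto file and served over HTTP/2, which is where the tradeoffs around streaming, binary payloads, and load balancing start to matter.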
July 25, 2024 | 12 min read
API Gateway
What is load balancing in Kubernetes?
Load balancing is the process of efficiently distributing network traffic among multiple backend services, and is a critical strategy for maximizing scalability and availability. There are a variety of choices for load balancing Kubernetes external traffic to Pods, each with different tradeoffs.
Selecting a load balancing algorithm should not be undertaken lightly, especially if you are using application-layer (L7) aware protocols like gRPC, which multiplex many requests over long-lived HTTP/2 connections. It's all too easy to select an algorithm that leaves a single web server running hot or produces some other form of unbalanced load distribution.
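As a hedged illustration of that gRPC caveat, the sketch below shows client-side round-robin balancing in Go; the service name assumes a hypothetical Kubernetes headless Service so that DNS resolves to every Pod IP rather than a single ClusterIP.

```go
// Illustrative sketch: client-side round-robin load balancing for gRPC in Go.
// The target below assumes a hypothetical headless Service so DNS returns all
// Pod IPs instead of a single ClusterIP.
package main

import (
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// gRPC multiplexes many requests over long-lived HTTP/2 connections, so
	// connection-level (L4) balancing can pin nearly all traffic to one Pod.
	// The round_robin policy spreads individual RPCs across every resolved address.
	conn, err := grpc.Dial(
		"dns:///my-grpc-service.default.svc.cluster.local:50051", // hypothetical headless Service
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithDefaultServiceConfig(`{"loadBalancingConfig": [{"round_robin":{}}]}`),
	)
	if err != nil {
		log.Fatalf("dial failed: %v", err)
	}
	defer conn.Close()

	// conn can now be passed to any generated client constructor,
	// e.g. pb.NewOrderServiceClient(conn).
}
```

An L7-aware proxy or gateway in front of the Pods achieves the same effect without changing client code, which is one reason this choice deserves deliberate thought.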
July 24, 2024 | 6 min read
API Gateway
A modern API gateway like the Edge Stack Kubernetes API Gateway empowers organizations with a cost-effective solution to harness the full potential of their microservices architecture by streamlining application development and management in a Kubernetes ecosystem. One area where Edge Stack really shines is continuous delivery testing, at many levels.
For example, you can deploy a new service or an upgraded version of a service into production and hide, or “cloak,” this service from end-users via the gateway. This effectively separates the deployment and release process, allowing you to run acceptance and nonfunctional tests on the cloaked service, such as load tests and security analysis. You can also perform canary testing by allowing a small amount of user traffic to flow to this new deployment.
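As a rough illustration of that traffic split (not Edge Stack's implementation, just the core idea), the Go sketch below routes a small, configurable fraction of requests to a hypothetical canary deployment.

```go
// Illustrative canary routing sketch: ~10% of requests go to the new version.
// Service names and the weight are assumptions, not from the article.
package main

import (
	"math/rand"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	stableURL, _ := url.Parse("http://orders-v1:8080") // hypothetical current release
	canaryURL, _ := url.Parse("http://orders-v2:8080") // hypothetical canary release
	stable := httputil.NewSingleHostReverseProxy(stableURL)
	canary := httputil.NewSingleHostReverseProxy(canaryURL)

	const canaryWeight = 0.10 // fraction of traffic sent to the canary

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if rand.Float64() < canaryWeight {
			canary.ServeHTTP(w, r)
			return
		}
		stable.ServeHTTP(w, r)
	})

	http.ListenAndServe(":8080", nil)
}
```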
There is also potential to use a gateway to “shadow” (duplicate) real production traffic to the new version of the service and hide the responses from the user, and “shift” traffic around to focus load on a specific cluster of your system. Finally, you can use an API gateway to implement and control chaos testing. These techniques allow you to learn how your service will perform under realistic use cases, load, and failure scenarios, which are critical for continuous delivery testing.
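Shadowing follows the same pattern, except every request is duplicated: the stable version answers the user while a copy goes to the new version and its response is discarded. A minimal sketch, again with hypothetical service names:

```go
// Illustrative traffic-shadowing sketch: the stable service serves the user,
// while a copy of each request is sent to the new version and ignored.
package main

import (
	"bytes"
	"io"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	stableURL, _ := url.Parse("http://orders-v1:8080") // hypothetical stable service
	shadowURL, _ := url.Parse("http://orders-v2:8080") // hypothetical new version
	proxy := httputil.NewSingleHostReverseProxy(stableURL)

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Buffer the body so it can be replayed to both backends.
		body, _ := io.ReadAll(r.Body)
		r.Body = io.NopCloser(bytes.NewReader(body))

		// Fire-and-forget copy to the shadow deployment; its response never reaches the user.
		go func(method, uri string, payload []byte) {
			req, err := http.NewRequest(method, shadowURL.String()+uri, bytes.NewReader(payload))
			if err != nil {
				return
			}
			if resp, err := http.DefaultClient.Do(req); err == nil {
				resp.Body.Close()
			}
		}(r.Method, r.URL.RequestURI(), body)

		proxy.ServeHTTP(w, r)
	})

	http.ListenAndServe(":8080", nil)
}
```

A gateway does this declaratively and at scale, but the mechanics are the same: real traffic exercises the new version while users only ever see responses from the stable one.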
July 19, 2024 | 5 min read
API Gateway
You've probably bumped into the term "dark launch" when reading about software feature releases by companies like Facebook, Google, and Amazon. In fact, Facebook coined the term when describing the launch of a new chat feature. They called it a dark launch because they deployed the code responsible for the new chat service to a small segment of their audience at a time. This allowed the engineers to monitor the new service using real production traffic without impacting the broader Facebook population's user experience.
What is a dark launch?
As in the Facebook example, a dark launch is an approach for incrementally releasing production-ready software features to groups of users, enabling you to assess real-time feedback and make changes before launching widely. Similarly, you can use this process to deploy code changes and expose them only in a limited environment, with no real user traffic flowing through the new code paths; think beta testing, but with no humans involved.
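A minimal sketch of how that gating might look in application code, assuming a deterministic hash of the user ID decides who falls into the dark-launch cohort (the function names and the 5% threshold are illustrative, not from the article):

```go
// Illustrative dark-launch gating: only a small, deterministic slice of users
// hits the new code path. Names and the 5% threshold are assumptions.
package main

import (
	"fmt"
	"hash/fnv"
)

// inDarkLaunch reports whether a user falls into the rollout cohort.
func inDarkLaunch(userID string, percent uint32) bool {
	h := fnv.New32a()
	h.Write([]byte(userID))
	return h.Sum32()%100 < percent
}

func handleChat(userID string) {
	if inDarkLaunch(userID, 5) { // roughly 5% of users
		fmt.Println("serving the new chat backend (monitored, not yet announced)")
		return
	}
	fmt.Println("serving the existing chat backend")
}

func main() {
	handleChat("user-1234")
}
```

In practice the same split is usually handled at the gateway or through a feature-flag service rather than hard-coded, so the cohort size can change without a redeploy.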
July 16, 2024 | 4 min read
API Gateway
One of our new segments, Community Corner, features weekly deep dives into common questions we get in our Community across our products: Edge Stack, Telepresence, and Blackbird. As a core member of our customer team, I find that one of the most common questions revolves around the key differences between our open-source offering, Emissary-Ingress, and our commercial product, Edge Stack API Gateway.
The TL;DR:
Edge Stack is Ambassador’s licensed API Gateway. It’s a closed-source product that has been adopted by companies in various industries around the world to manage traffic to their cloud-based services.
July 12, 2024 | 8 min read