Article
A Fusion of Security & AI
“Concrete jungle where dreams are made of…” Alicia Keys once sang about APIs... oh wait, that was about New York. Either way, the most recent APIDays New York conference was chock-full of relevant takeaways on the evolving API landscape, covering the critical facets of security, API development, and the growing influence of AI.
Overall, the greatest takeaway was that we’ve entered the era of composable API management. Dev teams clearly want the option to run multiple gateways and multiple API portals, and a flexible platform that makes it easy to swap those pieces in and out is paramount.
API Gateway, Kubernetes
The entire premise of Kubernetes is high availability and scalability: if you are building on Kubernetes, you expect to develop robust, scalable applications on the platform.
But doing so isn’t a given. As you scale up your deployments, Kubernetes requires careful management to ensure you get the necessary resiliency to maintain high availability and optimal performance. You need to consider points of failure, traffic bottlenecks, and dynamic workload management to ensure your application can handle increased load and maintain responsiveness.
This is where the Edge Stack API Gateway can support Kubernetes high availability and scalability. It addresses these challenges by integrating with Kubernetes control plane components, such as the kube-apiserver and controller manager, to help ensure a highly available (HA) cluster. Let’s first examine the challenges, then the best strategies for achieving availability and scalability with a Kubernetes API gateway.
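To make the resiliency point concrete, here is a minimal sketch of the liveness and readiness endpoints a Kubernetes probe (or a gateway’s health check) might poll, written as a small Node.js/Express service. The route paths, port, and dependency check are illustrative assumptions, not Edge Stack specifics.

```typescript
// Minimal sketch: liveness/readiness endpoints a Kubernetes probe (or a
// gateway health check) could poll. Paths, port, and the "dependenciesReady"
// flag are illustrative assumptions, not Edge Stack specifics.
import express from "express";

const app = express();
let dependenciesReady = false; // flip to true once DBs, queues, etc. are connected

// Liveness: "the process is up" -- keep this check cheap.
app.get("/healthz", (_req, res) => res.status(200).send("ok"));

// Readiness: "safe to receive traffic" -- gate on downstream dependencies.
app.get("/readyz", (_req, res) =>
  dependenciesReady ? res.status(200).send("ready") : res.status(503).send("not ready")
);

// Simulate dependency startup for the sake of the sketch.
setTimeout(() => { dependenciesReady = true; }, 2000);

app.listen(3000, () => console.log("listening on :3000"));
```

In a Deployment you would point the livenessProbe and readinessProbe at these paths and run multiple replicas, so traffic is only routed to pods that report ready.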
May 2, 2024 | 16 min read
Kubernetes API Gateway
Cloud native development has revolutionized the way organizations build and deploy applications. At the forefront of this transformation is Kubernetes, a battle-tested and widely adopted platform that enables centralized cloud native development. K8s has witnessed remarkable growth and adoption since its inception, and according to a survey conducted by the Cloud Native Computing Foundation (CNCF), 91% of respondents reported using Kubernetes in production, showcasing its dominance in the industry. This trend is expected to continue in 2024 as more organizations recognize the benefits of Kubernetes for managing containerized applications at scale.
The Need for Centralization
Successful cloud-native implementations often require the creation of a centralized platform that brings together various components. This centralized approach provides a "paved path" for developers and engineering teams, streamlining the delivery and release processes. Centralization in cloud-native development best practices can provide several benefits:
April 30, 2024 | 10 min read
Microservices
When you think about it, we’ve made great strides in software development, but each step has brought new and exciting challenges. We started with big, clunky monolithic systems, then broke them down into smaller pieces called microservices to gain greater flexibility, scalability, and resilience.
However, with great power comes great responsibility. Now, we have to manage these tiny microservices in our distributed systems. This is where microservice orchestration swoops in to save the day (Yes, I just did a Tobey-Andrew-Tom Spider-Man marathon! My web developer journey feels complete 🕺).
In this article, we will briefly look at what a microservice is, why microservice orchestration is essential, and then dive into nine microservice orchestration best practices that can make the deployment of microservices much smoother.
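To ground the idea before the best practices, here is a hedged TypeScript sketch of what central orchestration can look like: one service owns the workflow and calls two hypothetical downstream services in order, with a simple retry. The URLs, payloads, and retry policy are assumptions for the example only.

```typescript
// Illustrative sketch only: a central orchestrator calling two hypothetical
// services in sequence, with a simple retry. URLs, payloads, and the retry
// policy are assumptions for the example, not prescriptions.
async function callWithRetry(url: string, body: unknown, attempts = 3): Promise<unknown> {
  for (let i = 1; i <= attempts; i++) {
    try {
      const res = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(body),
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return await res.json();
    } catch (err) {
      if (i === attempts) throw err;              // give up after the last attempt
      await new Promise((r) => setTimeout(r, 250 * i)); // simple backoff between tries
    }
  }
}

// The orchestrator owns the workflow: reserve inventory, then charge payment.
async function placeOrder(order: { sku: string; amount: number }) {
  const reservation = await callWithRetry("http://inventory/reserve", order);
  const payment = await callWithRetry("http://payments/charge", order);
  return { reservation, payment };
}
```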
April 26, 2024 | 14 min read
Developer Productivity
Eat your own dog food. Don’t just talk the talk; walk the walk.
Drink your own champagne.
Whatever you choose to call it, in 2024 the concept of "drinking your own champagne" has become even more crucial for developer productivity, faster feedback loops, and building better products. This approach, which means using your own tools and solutions internally, has proven to be a game-changer for organizations across industries, and we practice it too! Let's explore how it contributes to the success of developers and the products they build, with relevant examples from the current landscape.
April 24, 2024 | 9 min read
Kubernetes
Many organizations using Node.js adopt cloud native development practices with the goal of shipping features faster. The technologies and architectures may change when we move to the cloud, but the fact remains that we all still add the occasional bug to our code. The challenge here is that many of your existing local debugging tools and practices can’t be used when everything is running in a container or on the cloud. A change in approach is required!
In this article, I’ll show you how to debug Kubernetes services using Telepresence and VS Code. This lets you keep using your local debugging tools even though your microservice application runs remotely.
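As a rough sketch of that workflow (the service name and port are assumptions, and exact Telepresence flags vary by version), the comments below walk through the typical steps, with a tiny Express handler you could set a breakpoint in:

```typescript
// Rough shape of the local-debugging loop described above; the service name,
// port, and exact Telepresence flags are illustrative and vary by version.
//
//   1. telepresence connect
//      (connect your laptop to the cluster)
//   2. telepresence intercept my-service --port 3000:http
//      (route the service's in-cluster traffic to localhost:3000)
//   3. Start this process under the debugger (node --inspect, or a VS Code
//      "node" launch config) and set breakpoints as usual.
import express from "express";

const app = express();

app.get("/orders/:id", (req, res) => {
  // A breakpoint here now fires for real cluster traffic arriving via the intercept.
  res.json({ id: req.params.id, status: "debugging locally" });
});

app.listen(3000, () => console.log("local debug instance on :3000"));
```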
Local Debugging with Mocks and Stubs Only Gets You So Far
April 23, 2024 | 15 min read