Serverless with appfleet

The appfleet cloud platform allows you to easily deploy and host your serverless services in multiple regions globally. Host your serverless web applications on the edge without any limitations!

appfleet is an edge platform

We host your dynamic services globally closer to your users

Globally Load-Balanced

Host your service, site or application on multiple locations at the same time. Be closer to your users and improve your performance and uptime.

Any tech-stack

appfleet is a cloud-based container hosting platform that allows you to deploy your code in any language, framework or technology.

Serverless Containers

Run stateless and stateful containers on a fully managed globally distributed infrastructure. No servers or routing to manage.

What is Serverless Architecture?

Serverless architecture is an increasingly popular computing model in which a cloud provider dynamically allocates and manages back-end resources, allowing developers to focus on code development rather than managing the backend servers and services.

With serverless architecture, developers break their business logic into discrete pieces, write code for each, and upload it to the cloud provider as functions. These functions are event-driven, which means they only run in response to a trigger, such as an incoming HTTP request.
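
To make this concrete, a serverless function is usually nothing more than a small handler that receives an event and returns a response. The sketch below is purely illustrative: the (event, context) signature and the response shape follow a common FaaS convention rather than any specific vendor's API.

    # Illustrative only: a generic event-driven handler. The (event, context)
    # signature and the response shape follow a common FaaS convention and are
    # not the API of any particular provider.
    import json

    def handler(event, context=None):
        # This code only runs when an event (for example an HTTP request)
        # triggers the function; nothing runs in between invocations.
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }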

Serverless is, therefore, a combination of Functions-as-a-Service and Backend-as-a-Service functionality that reduces operational costs and maintenance, but at the same time often increases the complexity of application development.

The key thing to note here is that serverless doesn't mean the servers have gone away; it just means that you don't need to worry about them anymore.

What are the Drawbacks and Limitations of a Serverless Architecture?

Serverless is an entirely different way of building and operating web systems. In general it makes your life easier, but it also comes with some major limitations.

  • Vendor lock-in - Every serverless implementation is currently highly dependent on the vendor you use, which makes it extremely difficult to switch from one vendor to another.
  • Limited features and tools - When you manage your own server you have full control over it and can configure things the way you want. With serverless, the vendor decides how much access and control you get over different features; most of the time you are given vendor-specific tools and have to build your processes around them.
  • Architectural complexity - Traditional applications are not built to go serverless. You have to refactor your application before it can run serverless, and because serverless applications follow an entirely different design pattern, this increases their complexity as well as the time and cost of onboarding your team.
  • Cold start - Your code eventually has to be executed on some server. By design, when your endpoint is called for the first time or is scaled up, it takes some extra time to initialize, which results in high latency and poor performance. This is a very common problem with serverless applications (see the sketch after this list).
  • Local testing - With serverless, your vendor controls the libraries and configuration of the environment where your code runs. You can usually bring your own packages, but with many limitations, and it is difficult to replicate the same environment locally to make sure everything will work in production. Because of this, testing takes a lot of time in the early stages of adoption.
  • Loss of control - With your own servers, the entire software stack is under your control: you can swap or alter components from the operating system down to the bootloader if needed. Going serverless means giving up full control of the stack, which then resides with the vendor.
  • Limitations with monitoring and logging - Serverless code executes in a stateless environment where nothing persists after execution. You therefore depend entirely on the vendor for access to logs, and each vendor's own implementation can lead to further lock-in and additional development time.
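
To illustrate the cold start point, here is a rough Python sketch: anything done at module import time (loading configuration, opening connections and so on) is paid once per cold start and then reused by every warm invocation. The handler shape is illustrative, not tied to any particular vendor.

    import time

    # Module-level setup runs once per cold start of a function instance,
    # for example when the endpoint is called for the first time or scaled up.
    # This is where the extra latency comes from.
    _START = time.monotonic()
    _EXPENSIVE_SETUP = {"loaded_at": _START}  # stands in for DB pools, models, etc.

    def handler(event, context=None):
        # Warm invocations reuse the instance, so only this part runs again.
        return {
            "statusCode": 200,
            "body": f"instance warm for {time.monotonic() - _START:.2f}s",
        }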


With appfleet you can take advantage of not managing servers while also overcoming a lot of the limitations above.

For instance, with appfleet you can create your Docker image and instantly deploy it in multiple regions without worrying about servers and management.
With a Docker image you get full control over the software stack, which also makes local testing seamless.
And since you are using containers, you can switch to any other platform whenever you wish.
With appfleet you also get additional features like smart load-balancing, failover, managed SSL, logging, monitoring and console access for debugging.
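
As a rough sketch of how little is needed, the standard-library-only service below runs the same on a laptop as it does inside a container (the file name app.py and port 8080 are just assumptions for the example), which is exactly what makes local testing and switching platforms straightforward.

    # app.py (name assumed for the example) - a minimal HTTP service using only
    # the Python standard library, so the same code runs unchanged on a laptop
    # or inside a container.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b'{"message": "hello from a container"}'
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Bind to 0.0.0.0 so the service is reachable when the container
        # port is published; 8080 is an arbitrary choice for the example.
        HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()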

Why choose appfleet as a Serverless Platform?

Appfleet gives you the flexibility of running serverless while saving you from many of the limitations that come with it. On top of this, appfleet offers a wide array of features that make hosting your applications a breeze.

With appfleet there are two simple steps to host your serverless app globally:
1. Choose your Docker image or GitHub repo
2. Pick the regions where you want to deploy
and you are done. Yes, it's that simple. Appfleet internally does all the magic for you and routes each user's traffic to the region closest to them.

Appfleet doesn’t enforce any language or tech stack on you. As long as your application is dockerized - we’ll run it.

Host your Serverless apps

and get a fully automated platform with multiple global POPs, high-availability and low latency

…and other features

appfleet doesn’t end with your product’s deployment

Custom health checks

Configure your own custom health checks per application and we will add them to our own internal checks that we continuously run to ensure your service is alive and well
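
A health check is typically just a cheap endpoint that reports whether the service can do useful work. The sketch below is illustrative: the /health path and the readiness logic are assumptions, so point the platform's checks at whatever your application actually exposes.

    # Illustrative /health endpoint; the path and the readiness logic are
    # assumptions, not appfleet requirements.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/health":
                # Replace with real readiness checks (database ping, cache, ...).
                self.send_response(200)
                self.end_headers()
                self.wfile.write(b"ok")
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()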

All languages supported

Node.js, PHP, Golang, Java, Python, everything is supported, thanks to Docker containers. Don't let your technology of choice limit you.

Included HTTPS

For any web service we can automatically install and maintain a free Let's Encrypt TLS certificate.

Better performance

By using multiple regions at the same time you can lower latency and easily improve performance for your global audience

Public & private registries

Use any public or private container registry like Docker Cloud, GitHub, Quay, Google Cloud and more

Console access

Take direct control of your container by connecting to it through our web console

File Cache

All nodes come with a locally mounted caching filesystem that persists between deployments for improved performance
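
As an illustration of how an application might use such a cache, here is a small sketch. The /cache default and the CACHE_DIR variable are assumptions for the example, not documented values; use whatever path the platform actually mounts.

    # Illustrative use of a locally mounted cache directory that survives
    # redeployments. The /cache default is an assumption, not a documented path.
    import json
    import os

    CACHE_DIR = os.environ.get("CACHE_DIR", "/cache")

    def cached_fetch(key, compute):
        # Return the cached value if it exists, otherwise compute and store it.
        path = os.path.join(CACHE_DIR, f"{key}.json")
        if os.path.exists(path):
            with open(path) as f:
                return json.load(f)
        value = compute()
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "w") as f:
            json.dump(value, f)
        return value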

Logging

We store and process the output and logs of all deployed applications for easier debugging.
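
Since container platforms generally collect whatever the process writes to standard output and standard error, logging to stdout is usually all an application needs to do. A minimal sketch:

    # Plain logging to stdout is usually enough: the platform collects the
    # process output for you.
    import logging
    import sys

    logging.basicConfig(
        stream=sys.stdout,
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
    )
    log = logging.getLogger("app")

    log.info("service started")
    log.warning("something worth looking at later")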

Monitoring

All of your instances are constantly monitored. Get historic and real-time CPU, RAM and Disk usage.
