It is hard to find any information on how cloud services handle a sudden spike in the number of requests, which makes it hard to make an informed decision when picking a cloud service. Here I attempt to document the autoscaling behavior of some cloud service providers.
A common question I get from devs new to JS is: How do I do “parallel” HTTP/API calls with JS?
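The usual answer is `Promise.all`, which starts every call immediately and resolves once all of them finish. Here is a minimal self-contained sketch; `getUser` is a hypothetical stand-in for a real `fetch` call so the example runs without a network.

```javascript
// Run several async tasks concurrently and collect their results in order.
// In real code each task would be something like fetch("https://...");
// getUser is a stand-in so the example is self-contained.
async function getUser(id) {
  return { id, name: `user-${id}` }; // pretend this is an HTTP call
}

async function getAll(ids) {
  // Promise.all kicks off every call at once and resolves when all finish,
  // instead of awaiting them one by one (sequentially).
  return Promise.all(ids.map((id) => getUser(id)));
}
```

The key point is that the calls are started *before* any of them is awaited; awaiting inside a plain `for` loop would serialize them.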
Say your external clients/partners want to be notified of certain events happening in your system. So you decide to push events/messages to their systems via HTTP (webhooks), and your client has an endpoint that accepts the requests. Here’s a security problem your client has to deal with: they have an endpoint open to the public internet which could be abused by bots / bad actors. How do you let their system know that a request they received is indeed from your service and is not a bot / spoofed / spam request?
Last year I wrote a blog post on concurrency control and ensuring data consistency while caching things to Redis from multiple processes.
Recently I revisited the blog post and couldn’t help wondering how over-complicated my final solution was. So here I correct my last solution.
How would you create a job queue that does not have duplicate entries? That is, one that is deduplicated by the publisher instead of the consumer.
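To make the idea concrete, here is an in-memory sketch of publisher-side dedup: track the ids currently queued in a set and refuse to enqueue an id that is already present. This is an illustration only; a real version would keep the set and the queue in shared storage (e.g. Redis) rather than process memory.

```javascript
// A job id can only sit in the queue once; duplicates are dropped at
// publish time, so the consumer never has to dedup.
class DedupQueue {
  constructor() {
    this.pending = new Set(); // ids currently in the queue
    this.items = [];
  }
  push(id, job) {
    if (this.pending.has(id)) return false; // duplicate, rejected by publisher
    this.pending.add(id);
    this.items.push({ id, job });
    return true;
  }
  pop() {
    const entry = this.items.shift();
    if (entry) this.pending.delete(entry.id); // same id may be enqueued again
    return entry;
  }
}
```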
If you have dealt with any form of data & a database, you have needed to assign unique IDs to the data you save. Many a time these IDs are internal, but sometimes IDs need to be used in other ways by your users: they may need to paste one, email it, or read it out over the phone. This is where a nice, short, readable ID benefits the user.
Lately I’ve been trying to re-implement a cache library using Redis. The (simplified) requirement is to use the cached value for a given key if present; otherwise fetch a fresh value and cache it to Redis (on the first request, or the first request after the cached value expires). Also assume that fetching the fresh value, even once, is a super expensive operation (it causes heavy load on the database).
Seems like a simple requirement, right? But one does not simply write distributed system code!
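To show why: the naive read-through pattern looks like this (a generic async get/set store stands in for Redis so the sketch is self-contained). The catch is that between the cache miss and the `set`, every process that hits a cold key sees a miss and runs the expensive fetch — the classic cache stampede this post is about.

```javascript
// Naive read-through cache: check, miss, fetch, store. Correct for one
// process; with many concurrent processes, a cold key makes them ALL
// run fetchFresh() at once, hammering the database.
async function getOrFetch(store, key, ttlSeconds, fetchFresh) {
  const cached = await store.get(key);
  if (cached !== undefined && cached !== null) return cached;

  const fresh = await fetchFresh(); // the expensive part
  await store.set(key, fresh, ttlSeconds); // TTL handling depends on the store
  return fresh;
}
```

Avoiding the stampede needs some form of coordination (e.g. a lock so only one caller fetches while the rest wait), which is where the distributed-systems headache begins.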
I am sure you’ve come across cases where you need to search for items multiple times in the same array. And you have written the code in an easy-to-read way, but not exactly the most performant way. Right?
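The typical shape of the problem, sketched with made-up data: repeated `Array.find` calls scan the whole array each time (O(n) per lookup), while building a `Map` index once makes each subsequent lookup O(1).

```javascript
const users = [
  { id: 1, name: "Ann" },
  { id: 2, name: "Ben" },
];

// Easy to read, but scans the array on every call:
const slowFind = (id) => users.find((u) => u.id === id);

// Index once up front, then each lookup is a constant-time Map.get:
const byId = new Map(users.map((u) => [u.id, u]));
const fastFind = (id) => byId.get(id);
```

The trade-off only pays off when you look items up more than a few times; for a single search, `find` is fine.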
Since I last wrote my post on background removal in 2016, I’ve searched for alternative ways to get better results. Here I will dive into my new approach.
What better way to start a fresh website than by writing about how I started it fresh?
Previously my website was a custom-written static website generator built with Mozilla’s nunjucks library, hosted on Rackspace Cloud Files. What I liked about this setup, compared to, say, WordPress or the Ghost blog platform, was that
- the website was fast to serve. It’s just static files over a CDN
- I had no headache maintaining the platform, like worrying about backups, upgrades, or security updates.
- the hosting was free. Rackspace Cloud Files provides storage + CDN for free.
What I didn’t like about this setup was the writing experience. It was all hand-written HTML and nunjucks templates. And even though I am proficient with HTML, it isn’t as convenient as, say, WordPress or Ghost. I wanted a better writing experience, while still keeping the good things about a static website.
So I went searching for a better way and found a couple of new & old tools