Saturday, 22 May 2021

Introduction

This article is a step-by-step tutorial on how to bypass CORS restrictions and integrate external resources into your website by implementing a simple proxy with a combination of two AWS services: API Gateway and Lambda. We will be building a proxy against a real-world example - for educational purposes only. This is intended to expose the problem and to help people grasp the implementation steps. To follow along, basic knowledge of AWS terminology (plus a valid account), nodejs and -of course- CORS is highly recommended.


Definition & problem exposition

Picture this: you have been developing on your local machine, with a local web server exposed at http://localhost:8080. Your website is not working properly, so you open the Google Chrome developer console and see the dreaded CORS error:

 

That's the typical CORS alert. To lay it out plainly, CORS is a browser security mechanism that automatically blocks requests for resources hosted on a different origin than the one the original page was loaded from, and authorizes them only when the requested resource specifically whitelists the calling origin (this allow/block behavior is defined at the web server level). To read more, check the official documentation here.
 
The real-world resource we will be using is this api endpoint (https://psl.service-public.fr/services/public/rest/communeOrCp/), which is used to build the dropdown list in this page:


This works pretty well, but what happens when we try to use the same API endpoint on our website, hosted at https://www.azaytek.com? Yes, you guessed it: CORS will prevent us from doing so and we are stuck 😣

Solutions

To solve this, we could certainly build our own proxy server using any tech stack we choose and then host it somewhere. The main disadvantage is that we would need a 24/7 running server, even when it's yawning and not receiving any requests. Here comes the interesting pay-as-you-go AWS cloud payment model: we are charged only for what we actually use, keeping in mind that the AWS API Gateway + AWS Lambda combination is pretty cheap for our use case, roughly $3.50 for 1 million requests (Lambda pricing / API Gateway pricing), without forgetting that we get AWS's performant, highly available and scalable infrastructure.

To implement our solution using AWS services, we will use AWS API Gateway to expose a public HTTP URL. This URL, when invoked, will trigger an AWS Lambda function that will perform the actual request (I will be using nodejs & the awesome fetch package, but you can use any technology as long as it is supported by Lambda). Once it gets the original response, it passes it back to API Gateway, which returns it to the client. Here is the overall architecture:

AWS Lambda setup

We will create a lambda function using nodejs and the fetch package. The key point here is that the fetch package is not included by default, so we need to upload the full project as a zip file (index.js, package.json and the node_modules/ folder). Keep in mind that you can implement the GET request in any technology supported by AWS Lambda.

Implementation:
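For reference, here is a minimal sketch of what such a proxy function could look like, assuming the node-fetch package is bundled in node_modules/ and reusing the endpoint from our example (adapt the target URL to your own use case):

 // index.js - a sketch of the proxy handler, not a definitive implementation
 const fetch = require('node-fetch');

 // The external resource we want to proxy (from the example above)
 const BASE_URL = 'https://psl.service-public.fr/services/public/rest/communeOrCp/';

 exports.handler = async (event) => {
   // API Gateway (proxy integration) passes the URL parameters here
   const params = event.queryStringParameters || {};
   const query = new URLSearchParams(params).toString();

   // Perform the actual external request, free of any browser CORS policy
   const response = await fetch(BASE_URL + '?' + query);
   const body = await response.text();

   // This shape lets API Gateway build the HTTP response; the CORS header
   // is what makes the browser accept it on our own origin
   return {
     statusCode: response.status,
     headers: {
       'Content-Type': 'application/json',
       'Access-Control-Allow-Origin': '*'
     },
     body: body
   };
 };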

Steps to deploy the code:

To create the function, kindly follow the steps described here: https://docs.aws.amazon.com/lambda/latest/dg/getting-started-create-function.html. Once created, you can go to the code source section and write your code in the online editor or, as in my case, simply upload the project as a zip file.


At this level, we have a working AWS Lambda function. To test it, we go to the "Test" tab, where we need to pass it a JSON object that mirrors the event our function would receive if it were triggered by AWS API Gateway. The test input should look something like this:
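A minimal sketch of such a test event (the terms parameter mirrors the query our target endpoint expects):

 {
   "queryStringParameters": {
     "terms": "paris"
   }
 }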


Once done, we click "Test" and we will see the result of our function execution and whether it was successful or not. Looking at the details, we will find more information. Here is a sample (we can see that our function performed a GET on https://psl.service-public.fr/services/public/rest/communeOrCp/ passing it a terms parameter equal to "paris", as set in the test input data):



πŸ‘Bonus Tip: to get more insight on the data structure the lambda function would receive when triggered via API Gateway, you can take a look at https://docs.aws.amazon.com/apigateway/latest/developerguide/set-up-lambda-proxy-integrations.html#api-gateway-simple-proxy-for-lambda-input-format.

Now our Lambda function is set up and works when we test it from the AWS dashboard or invoke it via the API, but how could we make it publicly available to the world (the internet)? Yes, you guessed that right: AWS API Gateway is the solution.

API Gateway setup

Introduction

To give you a brief and concise introduction to this AWS service, here is a quoted definition from the official documentation:

Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. ... Using API Gateway, you can create RESTful APIs and WebSocket APIs that enable real-time two-way communication applications.

For a plain-English definition of AWS API Gateway - alongside other AWS services - you can check this simple, yet awesome article.

Let's proceed.

Create the Gateway trigger

In order to create and deploy our public HTTP endpoint, we will create a new API Gateway API that will trigger our previously deployed Lambda function. This API endpoint will receive our URL parameters, pass them through to our lambda function - which will execute the external request - and return the response back to the client.

To do this, simply go to our Lambda function page, and in the "Function overview" section at the top of the page, click on "Add a trigger", then select API Gateway.


  This will prompt you to create a new API. Choose that option, and select HTTP API. 


Give it a name and continue (at this stage you can enable CORS in the additional parameters section, but to keep it simple, I personally recommend doing it later). For this tutorial only, I chose to make it an open API by selecting "Open" in the security field.



Once done, click "Add" at the bottom right. This will create a new gateway with a randomly generated URL and will set it as a trigger of our Lambda function. Note that this is the base URL, so if you want to call it to trigger your function, you have to concatenate it with the name of the trigger you defined. Example:

πŸ“ With a base URL [ https://alkj54au.randomely-generated-url.com/default], with a Lambda trigger named [ my-trigger-api ], the full URL becomes [ https://alkj54au.randomely-generated-url.com/default/my-trigger-api ]. Note that this is the name of the trigger, not the name of the Gateway.

If you are lazy like me, you can get the public endpoint when switching to the Lambda function configuration tab:


Now go to the API Gateway dashboard page and you will find the API you've just created, under the name you gave it:


Click on your Gateway and you will see its dashboard page, along with its configuration options:
 

❤️‍🔥 Tada 😎! We have a public URL that triggers our Lambda function. Call it from your browser and you should get a response. Now, we will pass our Gateway URL parameters to our Lambda function. Simple, keep reading.
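For instance, reusing the hypothetical base URL and trigger name from the example above, a quick check from the command line could look like:

 curl "https://alkj54au.randomely-generated-url.com/default/my-trigger-api?terms=paris"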
 
➕ If you want to configure CORS for your endpoint, security mechanisms, throttling, deployment environments etc., you can (and should) do it in the gateway dashboard page. This is beyond the scope of this article.

Pass the URL parameters to Lambda

[ This was mentioned in a previous section, but I would like to reformulate it now that we have drawn the full 360° picture ]

When invoked, a Lambda function receives its event data source as JSON. To get a grasp of the data that gets passed to Lambda by different types of triggers, you can check this resource. What interests us here is the API Gateway data source:
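Abridged, such a payload looks roughly like the sketch below (most fields are omitted for brevity; see the link in the bonus tip above for the full format):

 {
   "version": "2.0",
   "routeKey": "ANY /my-trigger-api",
   "rawPath": "/default/my-trigger-api",
   "rawQueryString": "terms=paris",
   "headers": { ... },
   "queryStringParameters": { "terms": "paris" },
   "requestContext": { ... }
 }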

 

The HTTP request that was sent to our Gateway is proxied to our Lambda function in the above format; URL parameters are present in the "queryStringParameters" field. So, for example, when calling our gateway with https://www.mygatewayurl.com/my-trigger-name?a=1&b=2, you will get a "queryStringParameters" JSON like:
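Sketched out, the relevant fragment of the event would be (note that API Gateway passes every parameter value as a string):

 "queryStringParameters": {
   "a": "1",
   "b": "2"
 }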

 
From there, you are free to use this data to execute the external, original HTTP request and return the response. Done. Long, but worth it.


  Conclusion

This is the end of this tutorial. We have seen how to create an HTTP proxy by combining AWS Lambda and API Gateway in order to bypass CORS restrictions on publicly exposed resources on the internet. I hope this was helpful, and I would like to insist that we are living in such a fascinating cloud and internet era where problems and solutions are everywhere! Everything is possible and the sky is the limit! As always, your feedback, questions or suggestions are highly welcome 😃

 

Tuesday, 2 March 2021

Introduction

This article is a step-by-step tutorial on how to set up an ssh tunnel - reliably ensuring it's running, and automatically restarting it when necessary - to forward local traffic to a remote destination. This works on servers running Linux and is intended to help people set up an ssh tunnel to connect to external services, e.g. mysql running on a remote server. To follow along, a basic knowledge of ssh, the Linux shell and cron is highly recommended.

SSH TUNNEL Linux tutorial local port forwarding

Basics

During your programming/IT career, there is a high chance that you've already used SSH, whether connecting to a remote server or cloning a git repository. Either way, both involve SSH on a basic level. But beyond these two use cases, SSH offers other functionalities which may boost your workflow and are sometimes necessary. We are talking about SSH tunneling - aka local or remote port forwarding. In this article, our main focus is local port forwarding, especially when trying to connect from your application/script code to an external remote server; as an example, connecting a nodejs script to a mysql database on a remote server through ssh. Enough talking, let's get our hands dirty!

Discussing possible approaches

Since we want to access a remote service - a mysql database from a nodejs script, as an example - we could use the tunnel-ssh npm package. This is absolutely doable, especially when you don't have enough access to create a system-level ssh tunnel using the command line. The main problem with this approach is that the software and the platform end up tied together. Personally, I prefer separating concerns: your code shouldn't know about any ssh tunneling, and your ssh tunneling shouldn't depend on a particular script to be running.

Setup

We are assuming that you have two running servers (Server_A acting as the client, Server_B acting as the remote database server, running mysql on the default 3306 port), and that you can connect from Server_A to Server_B using your favorite ssh client, i.e. simply: ssh -i my_private_key server_b_user@server_b_ip (personally, I prefer password-less ssh login using keys, but the same tutorial applies if you want to use password-based ssh login; the main difference is that you will be prompted to enter the password).

Having our requirements met, creating an ssh tunnel from Server_A to Server_B is quite simple; it's a single command that you have to run:

 ssh -L [local_ip:]local_port:localhost:remote_service_port -N -i my_private_key server_b_user@server_b_ip

  •  -L: here we specify the type of port forwarding ( -L for local, -R for remote and -D for dynamic ).
  •  [local_ip:]local_port: the local machine IP address and port to bind. When local_ip is omitted, the ssh client binds to localhost.
  •  localhost:remote_service_port: the hostname/IP and port of the service, as seen from the remote server.
  • -N: do not execute a remote command.
  • -i my_private_key, server_b_user and server_b_ip: the key, the ssh user and the server IP address.
In our specific use case - trying to connect to a remote mysql service running on Server_B - the above command becomes (I chose the available local port 33066, but you could use any available port above 1024; ports <= 1024 are root-restricted):

 ssh -L 33066:localhost:3306 -N -i my_private_key server_b_user@server_b_ip
 

This way, when we try to connect to localhost on port 33066, the request will be forwarded to the remote server's port 3306 over the ssh protocol. We can test the connection using telnet (or, alternatively, the netcat command):
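Either of the following commands should report a successful connection if the tunnel is up (33066 being the local port chosen above):

 telnet localhost 33066
 # or, with netcat:
 nc -zv localhost 33066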

To connect from your nodejs script or your php file, you simply have to configure a mysql connection to use the local address and port used above, i.e. 33066, and then specify the remote mysql user credentials as if you were connecting from within the remote server, i.e. localhost:3306:

nodejs example:
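Here is a minimal sketch using the mysql npm package; the credentials and database name are placeholders to replace with your own:

 const mysql = require('mysql');

 const connection = mysql.createConnection({
   host: '127.0.0.1',             // the tunnel's local end
   port: 33066,                   // the forwarded local port
   user: 'remote_db_user',        // credentials as defined on Server_B
   password: 'remote_db_password',
   database: 'my_database'
 });

 connection.connect(function (err) {
   if (err) throw err;
   console.log('Connected to the remote mysql through the ssh tunnel!');
   connection.end();
 });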

php example:
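And an equivalent sketch in php using PDO, with the same placeholder credentials:

 <?php
 // Connect to the tunnel's local end; traffic is forwarded to Server_B's mysql
 $pdo = new PDO(
     'mysql:host=127.0.0.1;port=33066;dbname=my_database',
     'remote_db_user',
     'remote_db_password'
 );
 echo "Connected to the remote mysql through the ssh tunnel!\n";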

You probably noticed that our nodejs/php clients don't know anything about the ssh tunneling; that's what we should always aim for 😀

Ensuring the tunnel is always up

This command is useful when you launch it, but once you exit the terminal or reboot your computer/server, the tunnel goes down, your connection is broken and you have to launch it again. In order to keep it running and reliably ensure it gets restarted when necessary or when it's not working properly, there is a multitude of choices: you can use the autossh package, as it does the job but requires more setup. My personal favorite alternative is a simple cronjob that does the job pretty well. It calls a small bash script every 5 minutes to check whether my tunnel is properly running, and launches it otherwise:
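Here is a minimal sketch of such a script; the port, key path and host are placeholders to adapt to your setup:

 #!/bin/bash
 # check_tunnel.sh - relaunch the tunnel if nothing listens on the local port
 if ! nc -z localhost 33066; then
     # -f backgrounds ssh once the tunnel is established
     ssh -f -N -L 33066:localhost:3306 -i /home/me/my_private_key server_b_user@server_b_ip
 fi

And the matching crontab entry, running the check every 5 minutes:

 */5 * * * * /home/me/check_tunnel.sh >/dev/null 2>&1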

  Conclusion

This is the end of this tutorial. We have seen how to create an ssh tunnel and ensure it's always up, with two examples of how to connect to a remote mysql database using nodejs/php. I hope this was helpful, and if you encounter any kind of problem or have suggestions for improvement, your comments are highly welcome. The most important thing to remember here is that there is always light at the end of the tunnel 😃

 

Sunday, 10 January 2021

Do you have a Google Sheet that contains thousands of urls that you want to get the short version of? Do you want to get it done in a matter of seconds? Do you want to implement this without a big hassle? If so, keep reading.

Summary

This is a step-by-step guide to implementing bulk url shortening directly from Google Sheets, based on the TinyURL shortening service and using Google Apps Script (the process is relatively similar for any other url shortening service like Bitly or Ow.ly). I will briefly describe the backstory, but feel free to jump straight to the process.

Backstory

I have a friend who owns a relatively small website, and he had a Black Friday campaign to launch. During this campaign, he wanted to share the urls that display his products on social media, and additionally, he wanted to mail them to his customers. The problem was that these urls are spammy-looking, i.e.: http://friendwebsite.com/shirts/filters/promotion=50&offer=flash&start=(date)&end=(date)&customer=%20%new%20%&utc=blackf . Since I was the computer guy, he asked me to handle the bulk shortening.

Before jumping to the steps, I would like to define the technologies being used and specify the requirements for this tutorial.

Definition:

  • Google Sheets: an online spreadsheet app that lets users create and format spreadsheets and simultaneously work with other people.
  • Google Apps Script: a rapid application development platform that makes it fast and easy to create business applications that integrate with Google Workspace, using JavaScript.
  • TinyURL: a URL shortening web service which provides short aliases for redirection of long URLs.

Prerequisites:

A Google Sheet with two columns: one that contains the urls to be shortened; the second - an empty one - will contain the end result.

sample spreadsheet for url shortening

Since GAS is a JavaScript platform, familiarity with JavaScript is recommended. Enough talk, let's get down to business ⚔️.

 Steps:

1-Access the script editor:

Action: click on "Tools > Script editor":


  

Explanation: here we access the GAS editor. By default, it opens a Code.gs file. This file contains a generic function called 'myFunction' that we don't need; you can either keep it there or remove it entirely. The key point here is that this Code.gs is tied to our sheet, so the function we will be developing is usable from inside the sheet like any other formula.

google-app-scripts-editor

 

2-Implement the url shortening function:

Action: create a new function called 'tinyurl_getShortLink' with the following code:
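The snippet below is a sketch of what this function could look like; it uses the public TinyURL endpoint quoted in the explanation below, and keeps error handling minimal:

 function tinyurl_getShortLink(url) {
   if (!url) return '';
   // Encode the long url so special characters don't break the request
   var endpoint = 'http://tinyurl.com/api-create.php?url=' + encodeURIComponent(url);
   var response = UrlFetchApp.fetch(endpoint, { muteHttpExceptions: true });
   if (response.getResponseCode() === 200) {
     return response.getContentText(); // the shortened url
   }
   return 'An error occurred';
 }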

Explanation: this code snippet might be pretty self-explanatory to JavaScript developers, but I would like to explain it to those without a web programming background. It takes the url that we want to shorten, referenced by the variable named 'url'. Since we are using the TinyURL service, we will be using an endpoint it exposes to generate a short url from a long one. This is the HTTP endpoint that we will be using: http://tinyurl.com/api-create.php?url=our_long_url_needs_to_be_injected_here (free to use 😃)

💡 An important note here is that we need to properly encode the url (see the encodeURIComponent call) to eliminate eventual problems related to the url's structure.

Once the url is properly encoded, we make the HTTP request to TinyURL to get the short version. This is done using the UrlFetchApp service (which is exposed by the Google Apps Script environment). If the request was successful (we got a 200 response code), we return the shortened url; otherwise, we return "An error occurred". As simple as that. Once finished, we hit save, give the project a name of our choice and we are good. Here is my azaytek_short_url project's screenshot:

url shortening code snippet - Google App Scripts


3-Apply our new formula, i.e. function, to the sheet's cells:

Action: apply the new formula, i.e. tinyurl_getShortLink, to the sheet.

Apply url shortening Formulas - Google App Scripts
 

Explanation: in the previous step, we created our own formula to shorten urls using Google Apps Script, and it became usable from inside our Google Sheet, just like any native formula. Now, we will apply it to the cells of our choice. We have to select the cell that should contain the short version, type =tinyurl_getShortLink(LONG_URL_CELL), and hit ENTER. TADA! We get the short version right in there!

💡 The 'LONG_URL_CELL' placeholder is a reference to the cell that contains the original url.

 
shorten-url-google-apps-scripts

If we would like to apply the formula to all the rows, we only have to select the cell that was already generated, then click and drag the fill handle down until we reach the end of the file. We will see a 'Loading...' indicator while our function executes:

shorten-url-google-apps-scripts

bulk-shorten-urls-result-google-sheet


Conclusion

That's all it takes to implement bulk url shortening from inside Google Sheets using Google Apps Script and the TinyURL service. If you found this tutorial helpful, feel free to share it. Thank you and stay tuned for my next article: web scraping like a pro using Google Apps Script, with IP rotation for free!


 

Saturday, 2 January 2021

azaytek 2020 review

This is my first blog post after a lot of procrastination (it has been exactly 3 years since I first wanted to start my personal blog). Besides surviving 2020, I will go through the headlines of the year, starting with a brief description of my work environment and the soft and hard skills I learnt there, and finishing with the stuff I learnt outside work.

 Decade

This year started quite well; 7 months after being recruited by Decade, I had my first internal evaluation meeting. It went well; both my managers and I were satisfied with the outcome. Shortly after, the pandemic hit us in Tunisia and a lockdown was imposed by the authorities. This is when things got interesting: I had to adjust to working remotely and brought my work setup home. We had to adapt.

During those hard times, I noticed people losing their jobs everywhere. At Decade, the attitude was the exact opposite: not a single job was lost due to COVID-19, new recruits were hired, online animation events were planned and took place remotely, we took the opportunity to sharpen our skills and learn new ones, and projects kept running. This was quite satisfying and refreshing 👍.

Skills I learnt at Decade:

Soft skills

1- Communication: yes, you read that right. As a fresh recruit at Decade, I recognized myself as an average, if not bad, communicator, which was caused by the lack of exposure and communication where I worked before. Practice makes perfect, and with more exposure to clients, meetings and presentations, I now evaluate myself as twice the communicator I was before. I feel it every day.
 
2- I had the opportunity to be a mentor: this was my first mentorship ever. New experiences are always good, especially remote ones. What I realized is that everyone, no matter their experience level, knows something that you don't. I learnt to be more patient with myself and with others.

Hard skills:

Being a Magento team member, I had the opportunity to sharpen my Magento 1.x skills, dive deep into its ecosystem, and learn Magento 2 along the way, since the former reached end of life by the end of June 2020. I'm not a big fan of the e-commerce framework; I just wanted to mention it here because it's the main framework used by my team. Nevertheless, I had the opportunity to deepen my understanding of how the web operates at different levels, and I leveled up my software release skills and knowledge. This is the achievement I'm happiest with.

I've been involved in multiple challenging projects: I wore the full-stack hat despite being in love with servers, devops and the cloud ecosystem. I had the opportunity to work with multiple tools and frameworks that I had already used before, but now with more depth and mastery. I'm not going to list them here since they might get jealous if I forget to mention any of them. Sharpening my php/js/scripting skills is the only detail I want to mention, because it all comes down to the bare languages.

 Side journey

Having a 40-hour workweek, a football addiction, a fiancée, a family, a band of demanding friends and a body that needs rest, it was not easy to find time for exploration and personal improvement. I'm proud of the list below, as short as it is.

1- Started my personal blog: started on December 31 - 3 months after buying the domain name - this is the most recent, the least effort-demanding, and yet the one I'm proudest of. Let's hope that the era of P R O C R A S T I N A T I O N is over 😀

2- Learning web automation: as a fan of nodejs/JavaScript looking for something with which I could build cross-platform desktop apps, I found electron; somewhere along the way, I came across nightmarejs. Compared to Selenium and phantomjs, I really liked how easy to pick up nightmare was. I played with it until I found google's puppeteer. Since then, I've been using it for multiple hobby projects hosted both locally and on Heroku.

3- Learning Google Apps Script: while passively browsing dev.to, I came across GAS and how it improves productivity. I searched the term and found out it was a google product. It's a platform that makes it easy (for people with JavaScript knowledge) to build add-ons for google products to automate mundane tasks, and I used it to implement multiple personal workflows.

4- Bought a new personal setup: this is the new setup that I built and bought. I named it "Borni", and it will be the subject of a future post.

5- Learning android app network sniffing: finding myself checking over and over whether there was a flash 4G gift on my SIM card provider's ( Orange ) android app, I decided to somehow automate it. I searched for android app sniffing, and there were a lot of misleading articles and tutorials all over the internet: either deprecated, not working with ssl encryption, paid stuff or simply not working. After multiple trials and errors, I managed to get it to work: I found the request and its params, got the response, and automated the http request invocation using nodejs initially, and then using the tool mentioned below. This is the only project that my father could understand, and I earned a "Good job son, can we have dinner now?". I will publish the details in a separate blog post soon.

6- Learning n8n: thanks to dev.to, I stumbled upon n8n. I searched the term and I'm glad I did. It is the simplest-to-install, easiest-to-use automation framework I've tested. For the non tech-savvy guys: it's a piece of software that lets you build automated workflows without writing a single line of code, and its main strength is connecting APIs and internet web services all together. For the tech-savvy guys: think of it as a light Talend, but based on nodejs, a lot simpler and yet more powerful at orchestrating APIs and web services.

 
This was my 2020 year's review, and this is the first official blog post I ever wrote. Since it's 2021's second day, I could think and prepare my roadmap for the new year ( I have things in mind like learning TypeScript, consistent blogging and moving azaytek to gatsby + aws since I've just activated my 12-months free trial 😻 to refresh and sharpen my devops knowledge ) but I prefer going with the flow. I'm a software engineer, i.e I procrastinate by default.