Working with AWS Cloud from the Command Line — Part 5

Sunny Goswami
5 min read · Nov 22, 2020

In this article, we will learn how to host a web application on a web server configured on the AWS cloud.

Before we proceed further, let's take a look at the prerequisites for this task:

  • An AWS Account.
  • AWS CLI pre-configured in the system.

To check whether you have the AWS CLI configured on your system, run this command in Command Prompt or a terminal:

Command: aws --version

If the output is command not found or 'aws' is not recognized as an internal or external command, the AWS CLI is not configured on your system.

If the AWS CLI is configured correctly, the command prints the installed version.
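The exact version string differs from system to system; as a rough example, it looks something like this:

Output: aws-cli/2.x.x Python/3.x.x <platform> botocore/2.x.x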

If you do not have the AWS CLI configured on your system, follow Part-1 of this series. Note: in Part-1, please specify the AWS region you are going to work in.

Let's get started.

Let's suppose we have a web application ready and we want to host it on the AWS cloud. For this, we will follow these steps:

Step 1: Launch an EC2 instance.

I have explained this step in detail in Part-2 of this series; follow that article to understand how to launch an EC2 instance on the AWS cloud.

You can choose any AMI to launch the instance from. I have used the Amazon Linux 2 AMI because it comes with a pre-configured yum repository.
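Part-2 walks through the launch in detail; as a minimal sketch (the AMI ID, key pair name, and security group ID here are placeholders you would replace with your own values, and t2.micro is just an example instance type), the launch command looks like:

Command: aws ec2 run-instances --image-id <ami_id> --instance-type t2.micro --count 1 --key-name <key_pair_name> --security-group-ids <security_group_id>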

Step 2: Installing the Apache httpd web server on the instance.
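The commands in this step are run inside the instance, so connect to it over SSH first. Assuming the Amazon Linux 2 AMI, the default user is ec2-user (the key file name and IP below are placeholders):

Command: ssh -i <key_pair.pem> ec2-user@<instance_public_ip>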

We will use the Apache httpd web server to host our web application. To install it, run:

Command: sudo yum install httpd -y

Step 3: Copying the web application pages into the web server.

The document root is the directory that stores the web application data, such as web pages. The web server serves files from this directory.

In the case of Apache httpd, the document root is /var/www/html. It is configured by default as the root directory for the httpd server.

So we have to copy the entire web application data into this directory. For this task, I will create a simple HTML page named index.html and copy it into this directory.
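As a minimal sketch (the page content here is just an example), the file can be created and copied into the document root like this:

Command: echo "<h1>My Web Application</h1>" > index.html
Command: sudo cp index.html /var/www/html/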

Step 4: Starting the httpd service.

For this, the command in Linux is:

Command: sudo systemctl start httpd

To verify that the httpd service is running, use the command:

Command: sudo systemctl status httpd

And that’s it, the application is live.

Access the webpage using curl from within the instance, at http://localhost/index.html or http://InstanceIP/index.html.
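For example:

Command: curl http://localhost/index.html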

(The IP in the red-marked area of the screenshot above is the instance's public IP.)

To access this webpage in a browser, we have to use the instance's public IP address. In this case, the URL is http://52.66.128.95/index.html.
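The public IP can also be fetched from the CLI (the instance ID below is a placeholder):

Command: aws ec2 describe-instances --instance-ids <instance_id> --query "Reservations[0].Instances[0].PublicIpAddress" --output text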

But if we try to access this page in a browser, we can't reach it.

The reason we are unable to access the webpage is the security group.

Security Group: an AWS virtual firewall that regulates inbound and outbound traffic to the instance.

Let's check what rules are currently allowed in the security group attached to the launched instance:

We can see that the inbound rules allow only port 22, which belongs to the SSH protocol. To allow HTTP access on port 80, run:

Command: aws ec2 authorize-security-group-ingress --group-name <security_group_name> 
--protocol <protocol> --port <port> --cidr 0.0.0.0/0
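For example, assuming the instance was launched with a security group named launch-wizard-1 (the name here is just a placeholder for your own group's name), allowing HTTP on port 80 from anywhere would look like:

Command: aws ec2 authorize-security-group-ingress --group-name launch-wizard-1 --protocol tcp --port 80 --cidr 0.0.0.0/0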

We can confirm this either through the GUI or through the CLI, using the command:

Command: aws ec2 describe-security-groups --group-names <security_group_name>

Now, if we try to access the webpage in the browser again:

It works!!

There are some things to understand about this web server configuration.

  1. We have not attached persistent storage to this instance. If the instance gets terminated, the entire web application data is deleted along with it. This type of storage is called ephemeral storage.

To understand ephemeral and persistent storage in detail, refer to Part-3 of this series.

Follow the next part in the series to understand how to make our web application data persistent, and how to integrate Amazon Simple Storage Service (AWS S3) with the web server for static data storage.

📝 Few Last Words

→ In the upcoming days, I will be sharing many articles on integrating multiple tools and technologies: Cloud Computing, DevOps, Big Data Hadoop, Machine Learning, etc.

→ Follow me on Medium for more research-based articles and integrations of new tools and technologies.

For further queries or suggestions, feel free to connect with me on LinkedIn.

For any doubt or query, let me know in the Response section!

If you like it, then clap and share!

Thank you, everyone, for reading! Arigatou gozaimasu (thank you very much). Sayonara (goodbye)!
