How I Successfully Hosted a Website on Amazon S3 πŸš€

Hosting a website on Amazon S3 was an exciting journey for me! In this guide, I’ll share every detail of how I did it, the challenges I faced, and the creative solutions I used. By the end, you’ll not only know how to host a website on S3 but also integrate other AWS services to take your project to the next level.


🌟 Why Amazon S3?

Amazon S3 is a fantastic choice for hosting static websites, and here’s why I chose it:

  • Highly Scalable: Automatically handles traffic spikes.
  • Cost-Effective: You pay only for the storage and data transfer you use.
  • Integrated with AWS Services: Works seamlessly with AWS CloudFront, Route 53, and more.
  • Global Reach: Delivers your content with low latency.

Step 1: Prerequisites

πŸ› οΈ What I Needed:

  1. AWS Account: Sign up on the AWS website.
  2. Domain Name: (Optional but recommended for branding). I used Route 53 for DNS management.
  3. Static Website Files: HTML, CSS, JavaScript, and assets.

Tip: Keep your files organized in a single folder before uploading.


Step 2: Creating an S3 Bucket

Here’s What I Did:

  1. Went to the S3 Console.
  2. Clicked on Create Bucket and set up the following:
    • Bucket Name: I used a globally unique name like brians-portfolio-site.
    • Region: Selected a region close to my audience.
  3. Disabled Block All Public Access (essential for website hosting).
  4. Enabled Bucket Versioning for rollback and backup purposes.
  5. Clicked Create Bucket.
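The console steps above can also be sketched with the AWS CLI. This is a minimal sketch assuming the example bucket name brians-portfolio-site and the us-east-1 region:

```shell
# Create the bucket (us-east-1 needs no LocationConstraint)
aws s3api create-bucket --bucket brians-portfolio-site --region us-east-1

# Disable Block All Public Access so a bucket policy can allow public reads
aws s3api put-public-access-block \
  --bucket brians-portfolio-site \
  --public-access-block-configuration \
  BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false

# Enable versioning for rollback and backup purposes
aws s3api put-bucket-versioning \
  --bucket brians-portfolio-site \
  --versioning-configuration Status=Enabled
```

Scripting these steps makes the setup repeatable if you ever need to recreate the bucket in another region.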


Challenge: The hardest part was understanding public access. AWS defaults to blocking public access, which is great for security but required additional configuration for website hosting.


Step 3: Uploading My Website Files

Steps I Followed:

  1. Opened my bucket from the S3 Console.
  2. Clicked Upload and selected my website files.
  3. Ensured the files were publicly readable (on newer buckets, where ACLs are disabled by default, the bucket policy in Step 5 handles this instead).
# AWS CLI command to upload files
# Note: --acl public-read only works when ACLs are enabled on the bucket;
# omit the flag if you rely on a bucket policy for public access.
aws s3 sync /local-website-folder s3://brians-portfolio-site --acl public-read

Tip: Using the AWS CLI made uploading faster and easier, especially when making updates.


Step 4: Configuring Static Website Hosting

What I Configured:

  1. Navigated to the Properties tab of my bucket.
  2. Enabled Static Website Hosting.
  3. Set the following:
    • Index Document: index.html
    • Error Document: 404.html
  4. Saved the configuration.
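The same configuration can be applied from the CLI in one command; this assumes the example bucket and document names used above:

```shell
# Enable static website hosting with the index and error documents
aws s3 website s3://brians-portfolio-site/ \
  --index-document index.html \
  --error-document 404.html

# The site is then served from the region-specific website endpoint, e.g.
# http://brians-portfolio-site.s3-website-us-east-1.amazonaws.com
```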


Challenge: Initially, I forgot to upload the 404.html file, which caused errors when accessing invalid URLs. Always double-check your error document!


Step 5: Bucket Permissions

To make my website publicly accessible, I updated the bucket policy.

Bucket Policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::brians-portfolio-site/*"
    }
  ]
}
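The policy above can be attached from the CLI; this sketch assumes it has been saved locally as policy.json:

```shell
# Attach the public-read policy to the bucket
aws s3api put-bucket-policy \
  --bucket brians-portfolio-site \
  --policy file://policy.json

# Verify the policy took effect
aws s3api get-bucket-policy --bucket brians-portfolio-site
```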

Tip: Use the AWS Policy Generator to create custom policies easily.


Step 6: Setting Up a Custom Domain

Using a custom domain made my website look professional.

Steps I Took with Route 53:

  1. Registered a domain via Route 53.
  2. Created an A Record pointing to the S3 bucket (for an S3 website endpoint alias, the bucket name must match the record name, e.g. www.briankimemia.com).
  3. Used Alias Target for seamless integration.

Example Route 53 Configuration:

{
  "Name": "www.briankimemia.com",
  "Type": "A",
  "AliasTarget": {
    "HostedZoneId": "Z3AADJGX6KTTL2",
    "DNSName": "s3-website-us-east-1.amazonaws.com",
    "EvaluateTargetHealth": false
  }
}
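The record above can be created with a Route 53 change batch. In this sketch, Z1EXAMPLE is a placeholder for your own hosted zone ID, while Z3AADJGX6KTTL2 is the fixed alias zone ID for us-east-1 S3 website endpoints:

```shell
# UPSERT the alias A record into the domain's hosted zone
aws route53 change-resource-record-sets \
  --hosted-zone-id Z1EXAMPLE \
  --change-batch '{
    "Changes": [{
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "www.briankimemia.com",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z3AADJGX6KTTL2",
          "DNSName": "s3-website-us-east-1.amazonaws.com",
          "EvaluateTargetHealth": false
        }
      }
    }]
  }'
```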

Step 7: Adding HTTPS with CloudFront

To secure my website, I integrated CloudFront for HTTPS support and caching.

What I Did:

  1. Created a CloudFront Distribution.
  2. Set the Origin Domain Name to my S3 bucket.
  3. Enabled Redirect HTTP to HTTPS.
  4. Attached an SSL Certificate from AWS Certificate Manager (for CloudFront, the certificate must be issued in the us-east-1 region).
# Example CloudFront command
# Tip: to keep index/error-document routing, use the S3 website endpoint
# (brians-portfolio-site.s3-website-us-east-1.amazonaws.com) as the origin.
aws cloudfront create-distribution --origin-domain-name brians-portfolio-site.s3.amazonaws.com

Challenge: Configuring SSL was tricky initially, but AWS Certificate Manager simplifies the process.
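Requesting the certificate itself is a single call; DNS validation then requires adding the CNAME record that ACM returns. The domain below is the example from Step 6:

```shell
# Request a certificate in us-east-1 (required for CloudFront) via DNS validation
aws acm request-certificate \
  --domain-name www.briankimemia.com \
  --validation-method DNS \
  --region us-east-1
```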


Step 8: Enhancing with Other AWS Services

Here’s how I further enhanced my website:

1. AWS Lambda

  • Used Lambda to process form submissions via API Gateway.

2. Amazon DynamoDB

  • Stored user feedback and data collected from the website.

3. AWS Amplify

  • Experimented with Amplify for automated CI/CD deployment.

4. Amazon CloudWatch

  • Monitored website performance and traffic.
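As a small example of the CloudWatch monitoring mentioned above, here is a sketch of pulling hourly request counts for a CloudFront distribution. EXAMPLE_DIST_ID is a placeholder, CloudFront metrics live in us-east-1, and the `date -d` syntax assumes GNU date:

```shell
# Sum of CloudFront requests over the last day, in 1-hour buckets
aws cloudwatch get-metric-statistics \
  --namespace AWS/CloudFront \
  --metric-name Requests \
  --dimensions Name=DistributionId,Value=EXAMPLE_DIST_ID Name=Region,Value=Global \
  --start-time "$(date -u -d '1 day ago' +%Y-%m-%dT%H:%M:%SZ)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --period 3600 \
  --statistics Sum \
  --region us-east-1
```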

Lessons Learned

  1. Permissions Matter: Misconfigured permissions can make or break your setup.
  2. Start Small: Begin with simple configurations and gradually integrate advanced features.
  3. Document Everything: Keeping track of settings saved me a lot of headaches.

Final Thoughts

Hosting a website on Amazon S3 was a rewarding experience. The integration with other AWS services like CloudFront and Route 53 elevated my project to a professional level. If you’re planning to host your site on S3, don’t shy away from exploring the vast AWS ecosystem.

Next Steps: Try adding AWS Lambda or API Gateway to bring interactivity to your static site!



πŸ“§ Contact

For questions or feedback, reach out:

πŸ“¨ Email: projects@briankimemia.is-a.dev
🌐 Portfolio: Brian Kimemia
GitHub: BrianKN019


Thank you for exploring this project! Let’s innovate and build secure AWS solutions together. πŸš€