By Maham Haroon

Transferring a Website from Wix to AWS [Part - 1 Static Content]

Architecture of static website hosting on AWS

The "Why"?

Wix and AWS are distinct platforms with different target audiences, making the process of migrating a website between them less than ideal and often resulting in some unavoidable downtime.

Considering this, I made the decision to move one of my websites from Wix to AWS based on several compelling factors:

  1. Reduction in Management Overhead: As I was already building an application on AWS and planning to utilize a significant number of AWS resources, consolidating everything in one place appeared to be a smarter approach. This consolidation simplifies various processes and minimizes the management overhead associated with maintaining separate platforms.

  2. Enhanced Security and Control: While Wix offers certain conveniences, it lacks the level of customization and comprehensive security features that AWS provides. By migrating to AWS, I gained greater control over the security aspects of my website and could leverage the advanced security measures offered by AWS.

  3. Cost Effectiveness: In my specific case, I had access to AWS credits that further incentivized the migration. Even without these credits, AWS proved to be a more cost-effective solution for my website, especially considering the scalability and flexibility offered by AWS's pricing structure.

By migrating from Wix to AWS, I aimed to optimize the management processes, enhance the security posture, and leverage the cost advantages available. It allowed me to consolidate resources, benefit from AWS's robust security features, and take advantage of potential cost savings.

The "How"?

The planning phase of the migration was relatively straightforward, though a few awkward spots required less-than-ideal workarounds.

Before initiating domain transfer

First, I needed to retrieve and preserve all the website data, including contacts and other relevant information. Wix, however, does not provide a straightforward way to download pages and complete site data, so I used wget as a workaround.

Downloading webpages using wget

Wget can download the HTML pages of a website. On macOS, wget can be installed via Homebrew:

brew install wget

A basic wget invocation works for downloading individual public pages, but it has to be run manually for each one, which becomes cumbersome with a large number of pages. In that case, a single recursive invocation automates the retrieval of content across the whole site.

# example.com stands in for your domain; static.wixstatic.com is
# where Wix serves media assets from (hence --span-hosts)
wget \
     --recursive \
     --level 3 \
     --page-requisites \
     --adjust-extension \
     --span-hosts \
     --convert-links \
     --domains example.com,static.wixstatic.com \
     --no-parent \
     https://example.com/

I successfully downloaded all the required pages from the website. Fortunately, the majority of the static content consisted of public pages, which simplified the process.

Data download

Once the webpage downloads were complete, I proceeded to capture screenshots and download the relevant databases associated with the dynamic pages. Additionally, I copied the necessary code snippets for the logic implemented on these pages.

Furthermore, I retrieved all contact information and user data to guarantee a comprehensive backup of essential data.

Record retrieval

I also saved all the associated domain records from Wix, specifically the MX records including priority and TTL values.
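As a backup alongside the values copied from the Wix dashboard, the live records can also be snapshotted from the command line; `example.com` below is a placeholder for the actual domain:

```shell
# Snapshot the DNS records Wix currently serves, before anything changes hands.
# example.com is a placeholder for your domain.
dig +noall +answer example.com MX
dig +noall +answer example.com NS
dig +noall +answer example.com TXT
```

The `+noall +answer` flags trim dig's output down to just the answer section, which is easy to save to a file.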

Domain Records for Wix domains

Uploading content to S3

I created a new bucket named after my domain, keeping in mind that bucket names containing dots are discouraged except for buckets used for website hosting, where the bucket name must match the domain.

Initially, I refrained from modifying any of the HTML pages, although I anticipated the need for future modifications to accommodate specific processes.

I uploaded the original HTML files as they were and enabled static website hosting on the bucket, setting the default page to index.html file.
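These steps can also be scripted: `aws s3 sync ./site s3://{bucket}` uploads the files, and `aws s3api put-bucket-website` applies a website configuration like the JSON document below (the `error.html` key is a hypothetical error page; only `index.html` is confirmed above):

```json
{
  "IndexDocument": { "Suffix": "index.html" },
  "ErrorDocument": { "Key": "error.html" }
}
```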

If the data access patterns of a website are not well-defined, enabling S3 Intelligent-Tiering on the bucket is advisable; in my case, I stayed with the Standard storage class.

I disabled the ACL options as recommended, and also disabled block public access since I was testing the web content for public availability.

Furthermore, I enabled versioning on the bucket to maintain a history of changes, while object lock was not enabled as I anticipated content modifications within the bucket.

Lastly, I added a simple policy to the bucket, granting read permissions to everyone, ensuring appropriate access control.

        "Resource":"{arn of my bucket}/*"
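Filled out, that read-for-everyone policy takes the standard shape below; the `Resource` keeps the bucket-ARN placeholder from above:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "{arn of my bucket}/*"
    }
  ]
}
```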

The objects within the bucket were now accessible over the internet via the S3 website endpoint, confirming that the S3 side of the setup worked.

For the dynamic pages, I employ RDS, Lambda, DynamoDB, and API Gateway, but further details regarding their integration will be discussed in the subsequent part of this post.

It is worth noting that the pages downloaded from Wix required adjustments and changes. Therefore, I modified them as required and uploaded them to S3, taking advantage of the versioning feature to ensure efficient content management.

Creating a CloudFront Distribution

Next, I created a new CloudFront distribution. Initially, I did not enable any enhanced security features such as WAF; I created a simple distribution and specified the S3 website endpoint of the bucket as the origin for my domain.

Most of the other options don't matter much here, but I find it a good idea to choose the redirect-HTTP-to-HTTPS option.

Once the origin is added, minimal additional configuration is required. I verify the functionality of my website by testing the provided link, confirming its accessibility.

To get cleaner URLs, it is worth renaming the files in the S3 bucket to drop the ".html" extension. A page then becomes reachable at "{cloudfront domain name}/page" rather than "{cloudfront domain name}/page.html".
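The renaming can be scripted before upload; a minimal sketch, assuming the wget output sits in a local folder (here called `site/`), with `index.html` left untouched since S3 serves it as the default document:

```shell
# Drop the ".html" suffix from mirrored pages so their URLs read
# /about instead of /about.html. index.html is kept as-is.
strip_html_ext() {
  dir=$1
  for f in "$dir"/*.html; do
    [ -e "$f" ] || continue              # no matches in the directory
    base=$(basename "$f" .html)
    [ "$base" = "index" ] && continue    # keep the default document
    mv "$f" "$dir/$base"
  done
}

# Example: strip_html_ext site
```

After renaming, the objects no longer carry an extension, so the content type must be set explicitly when uploading (e.g. `aws s3 cp --content-type text/html`), or S3 will serve them as a generic binary type.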

Moreover, it is advisable to configure an error page to handle situations when the website is temporarily unavailable or other issues arise.

At this stage, it is also recommended to update the S3 bucket policy to restrict access solely through the CloudFront distribution.
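With Origin Access Control, the locked-down bucket policy replaces the public one. Note two assumptions: this pattern applies when the distribution's origin is the bucket's REST endpoint (OAC does not work with the S3 website endpoint), and the distribution ARN below is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudFrontServicePrincipal",
      "Effect": "Allow",
      "Principal": { "Service": "cloudfront.amazonaws.com" },
      "Action": "s3:GetObject",
      "Resource": "{arn of my bucket}/*",
      "Condition": {
        "StringEquals": {
          "AWS:SourceArn": "{arn of my distribution}"
        }
      }
    }
  ]
}
```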

Apart from these tasks, there were no further immediate actions required concerning CloudFront.

Using a test domain

I purchased an additional domain through Route 53 to test the website on a new domain first. Route 53 lets you search for available domain names and register the one you want. I then created a public hosted zone for that domain, which stores essential records such as NS, SOA, A, MX, and CNAME.

Creating an Alias for CloudFront by linking to Route53

It is preferable to serve your content under your own domain name rather than the random CloudFront distribution URL, so that customers reach it via the domain registered in Route 53.

To achieve this, the first step is to obtain a public SSL certificate from ACM. During certificate creation, you specify the domain name, a subdomain such as www.{your domain}, or a wildcard *.{your domain} to cover all subdomains. To verify domain ownership, you can choose between DNS validation and email validation; in this case, DNS validation proved the more straightforward method. After creating the certificate, you simply add the DNS record ACM provides to Route 53 to confirm domain ownership.

Next, edit your CloudFront distribution and add an alternate domain name (CNAME), e.g. www.{your domain}. Additionally, associate the certificate generated in ACM with the distribution.

Finally, navigate to Route 53 and, within your hosted zone, create an alias A record for that domain name pointing to your CloudFront distribution.
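The same record can be created with `aws route53 change-resource-record-sets`, whose change batch is a small JSON document. Here `www.example.com` and the `dXXXX.cloudfront.net` name are placeholders, while `Z2FDTNDATAQYW2` is the fixed hosted-zone ID used for all CloudFront alias targets:

```json
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "www.example.com",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "dXXXXXXXXXXXX.cloudfront.net",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
```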

Keep in mind that it may take some time for the changes to be fully deployed and propagated. However, once everything is in place, you can access your content via CloudFront using your domain name.

At this point, everything appears to be in order, and you can focus on editing the HTML pages to incorporate the desired functionality. This may involve addressing dynamic content concerns and configuring the logic for Lambda and API Gateway. It is recommended to complete these tasks before transferring the domain.

Transferring Domain

Assuming the dynamic content setup is complete, select the "transfer out" option for the Wix domain. Wix will provide an authorization code; save it. You'll encounter multiple confirmations and warnings; confirm and proceed with the process.

Domain transfer option on Wix

Next, navigate to Route 53 and check that the domain is unlocked and eligible for transfer to the new registrar. During the transfer process, you will be prompted for the code Wix provided.

The transfer is expected to take approximately 7 days to complete. However, since all other configurations are already in place, the subsequent steps should be straightforward.

Transferring into AWS Route53

Once the transfer completes (you'll receive several confirmation emails), add a public hosted zone in Route 53 for the new domain. Within the hosted zone, locate the NS records and make a note of them. Then go to the registered domain and update its domain information: add the new NS records and remove the previous Wix NS records, as they will no longer function. Note that these changes may take up to a day to propagate, depending on the TTL of the NS records.

Additionally, ensure that the previously stored MX records are also added within this domain configuration.

Moving forward, obtain an ACM certificate for the subdomain you wish to use, if you haven't already; during this process, you may need to select email validation as the validation method. Once the certificate is issued, add the new domain as an alternate domain name (CNAME) on your CloudFront distribution and associate the issued certificate with it. Finally, add an alias A record for that name in the Route 53 hosted zone.

Given that all other necessary configurations are already in place, you should be ready to go as soon as the new NS records are fully propagated.
