Friday, November 11, 2016

Jira > Confluence API > Getting page content

To find a page in Confluence by title and space key:
host/rest/api/content?title=Title%20Page&spaceKey=MYSPACE
This will typically get you the ID of the page.

With the ID it's much easier to find the page. The API call to get the page with all of its data is:
host/rest/api/content/{id}?expand=space,body.view,version,container
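Stringing the two calls together from code is straightforward. A minimal C# sketch for building the URLs above; the host, space key, title, and page ID are placeholders, not a real instance:

```csharp
using System;

// Build the two Confluence REST URLs described above. All arguments are caller-supplied.
string FindPageUrl(string host, string space, string title) =>
    $"{host}/rest/api/content?title={Uri.EscapeDataString(title)}&spaceKey={space}";

string GetPageUrl(string host, string id) =>
    $"{host}/rest/api/content/{id}?expand=space,body.view,version,container";

Console.WriteLine(FindPageUrl("https://confluence.example.com", "MYSPACE", "Title Page"));
Console.WriteLine(GetPageUrl("https://confluence.example.com", "12345"));
```

Feeding these URLs to HttpClient.GetStringAsync (with your Confluence credentials attached) gives you the JSON containing the page ID in the first case, and the full page content in the second.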

C# > Modifying xml files

I have been neglecting my blog for way too long again; the short-term and ad-hoc work is once again getting in the way of long-term learning. Test data often comes in the form of XML files, and often that data needs to be modified to suit your needs or to stay up to date. Recently I was given the task of enriching existing test XMLs with an extra node. Doing this manually would be possible, but error-prone and mind-numbingly boring. Let's investigate how we can enrich the test XMLs with this new node automatically.

Scenario 1: we have an .xsd of the test XML which we have converted into C# classes, giving us the ability to deserialize the XMLs into C# objects and modify them accordingly.

And... it was actually so easy and trivial to enrich the document via this method that it ended the blog post right there.
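For the record, the deserialize-modify-serialize round trip looks roughly like this. The TestData class here is a hypothetical stand-in for the classes generated from the .xsd:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Round trip: deserialize the XML, set the new node, serialize it back.
string AddExtraNode(string xml, string value)
{
    var serializer = new XmlSerializer(typeof(TestData));
    using var reader = new StringReader(xml);
    var data = (TestData)serializer.Deserialize(reader);

    data.ExtraNode = value; // the actual enrichment

    using var writer = new StringWriter();
    serializer.Serialize(writer, data);
    return writer.ToString();
}

Console.WriteLine(AddExtraNode("<TestData><Name>case1</Name></TestData>", "extra"));

// Hypothetical stand-in for the xsd-generated classes.
public class TestData
{
    public string Name { get; set; }
    public string ExtraNode { get; set; } // the node we want to add
}
```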


Friday, October 7, 2016

C# > October > DUH! Moments

Time for a new month of facepalms!


  • GUID stands for...
    • Globally Unique Identifier
  • In Visual Studio:
    • The key combo for 'Quick Actions' is: CTRL + .

C# > September > DUH! Moments


  • When working with I/O, Streams, Writers, etc. ALWAYS use 'using{}'
  • In regex the '?' makes the preceding item optional:
    • Or: '?' in regex means '0 or 1'
    • An? (matches both A and An)
    • 23(rd)? (matches both 23 and 23rd)
  • In SpecFlow: when using regex step definition bindings you can create step definition methods with multiple parameters easily by having multiple (.*)s in the regex.
  • C#/.NET syntax for matching with regular expressions:
    • string pattern = "[regex pattern]";
    • string input = "[input]";
    • Match match = Regex.Match(input, pattern);
    • Assert.True(match.Success); etc.
  • Useful methods of DateTime objects:
    • myDateTime.AddDays(int days)
    • myDateTime.AddMonths(int months)
    • etc.
  • In Visual Studio: when debugging, the Immediate Window is an amazing place where you can type statements which are executed immediately and their value shown.
  • A class containing [Fact]-s for xunit.net should be public
  • ?? operator
    • The null-coalescing operator: evaluates to the left operand if it is not null, otherwise to the right operand
  • LINQ:
    • var query = from element in collection select element.attribute
    • var query = collection.Select(element => element.attribute)
  • In regex:
    • '+' means '1 or more'
    • '$' means 'end of string or end of line'
    • '\d' means 'digit'
  • Working with the JIRA REST API and the JIRA Zephyr API (ZAPI) I've been making my very first steps in web-based APIs. So there are a LOT of "DUH!" moments. The most newbie one so far:
    • We have GET requests, and POST requests.
      • GET requests:
        • Have no body
        • Arguments to the request are passed via the URL.
          • Example: /api/latest/resource?argumentName=argumentValue
      • POST requests:
        • (Can) have a body
        • Arguments to the requests are typically passed via a body (such as JSON)
    • If you don't follow these rules you get fancy error messages such as:
      • "Cannot send a content-body with this verb-type".
    • The different calls and what they mean:
      • GET: read
      • POST: create
      • PUT: update
      • DELETE: delete
      • (apparently there is PATCH as well - for partial updates)
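Several of the C# snippets from this list, collected into one runnable sketch:

```csharp
using System;
using System.Linq;
using System.Text.RegularExpressions;

// Regex: '?' = 0 or 1, '+' = 1 or more, '\d' = digit, '$' = end of string/line.
Match match = Regex.Match("23rd", @"^\d+(rd)?$");
Console.WriteLine(match.Success);            // True ("23" on its own matches too)

// DateTime values are immutable; AddDays/AddMonths return a NEW value.
DateTime first = new DateTime(2016, 9, 1);
Console.WriteLine(first.AddDays(1).Day);     // 2
Console.WriteLine(first.AddMonths(1).Month); // 10

// '??' - the null-coalescing operator: left operand if not null, else right.
string maybeNull = null;
Console.WriteLine(maybeNull ?? "fallback");  // fallback

// LINQ: query syntax and method syntax do the same thing.
string[] verbs = { "GET", "POST", "PUT", "DELETE" };
var q1 = from v in verbs select v.Length;
var q2 = verbs.Select(v => v.Length);
Console.WriteLine(string.Join(",", q1));     // 3,4,3,6
Console.WriteLine(q1.SequenceEqual(q2));     // True
```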

Wednesday, September 21, 2016

MTM > Microsoft Test Manager > First Steps

I will have to set up a test automation framework at my new job and, considering that my new company essentially uses the entire Microsoft stack, I'm going to see if Microsoft Test Manager (MTM) will suit our purposes.

Long live the Pluralsight courses:

  • Structure:
    • Test Plan
      • Test Suite
        • Test Cases
  • Test Plan Configuration
    • What are you testing?
    • How are you testing it?
    • Test Plans are now TFS Work Items
    • Inside the Test Plan:
      • Plan
      • Test
      • Track
      • Organize
    • Plan section:
      • Contents
      • Results
      • Properties
    • Test Plan Properties (& Run Settings):
      • Settings
      • Environments
      • Assigned Builds
      • Test Configurations
      • Run Settings:
        • Test settings:
          • General
          • Roles
          • Data and Diagnostics (there is a LOT here)
          • Summary
        • Test environment
          • This leads to the 'lab section' of the MTM which is out of scope for the course. TODO: Lab section MTM.
        • Builds:
        • Information about which build the tests are run against.
        • Configurations:
          • Operating System
          • Browser etc.
          • If multiple configurations are applied to the Test Plan then a single Test Case will be 'created twice' - one for each of the configurations.
      • Customizations
        • Test Plans are now considered TFS Work Items; this has all sorts of handy consequences, such as:
          • Work Item
          • Fields & States
          • History
        • TODO: Learn about Work Items.
          • For example: make changes to the Work Items states with the help of Windows Powertools.
    • Track section:
      • Here we can query for Work Items (and therefore also Test Cases etc.)
    • MTM Web Interface: with the correct licenses there is a really handy Web Interface in the case you require MTM on machines that don't have MTM installed.
  • Plan: Create Test Suites and Test Cases
    • Test Suites
      • Types:
        • Static
        • Requirements-based
        • Query-based
      • Out-of-the-box states:
        • In Planning
        • In Progress
        • Completed

    • Aside:
      • Test Cases are now available as TFS work items
    • You can add Test Suites to the Test Plan
      • Be they Static, Requirements-based, Query-based, etc.
    • You can then add Requirements to the Test Suite
      • Doing so will open a window in which to query for TFS requirements
    • Test Cases
      • Are TFS Work Items with all the accompanying benefits
      • Consist of Steps
      • Can be assigned to Testers
      • Can have their own sets of configurations
    • Test Points
      • Test Cases x nr. of configurations
      • Test results are tied to a Test Point
    • LEARNING MOMENT
      • In MTM:
        • ADD stands for adding something existing
        • NEW stands for creating something... new(!)
    • END LEARNING MOMENT
    • Creating Test Cases:
      • Steps
      • Summary
      • Tested Backlog Items
      • Links
      • Attachments
      • Associated Automation
Aannnnnnnd... it's gone.
If the goal is to fail hard and fail fast we're not doing too badly, because after investing roughly a day into MTM it has become clear that this is currently not an efficient tool for my new company. So we're dropping the course and this line of MTM investigation. Next step: Zephyr for Jira!

Sunday, August 7, 2016

TFS > Connecting to TFSVC on your Visual Studio Team Services account

The future is in the cloud. So as we try to get a grip on TFSVC let's learn how to get a grip on it in the Microsoft cloud dedicated to this: Visual Studio Team Services.

What we want to achieve:

  • Set up a Collection (or at least a Team Project) in VSTS
    • Create a Team Project in this Collection
  • Add an empty Solution to the Team Project
  • Add existing code to the Solution
  • Perform generic version control operations on the Team Project/Collection


Set up a Collection (or at least a Team Project) in VSTS:
For guides/instructions we use - amongst others - the page:
https://www.visualstudio.com/docs/overview

Firstly create a Visual Studio Team Services account as described in 
https://www.visualstudio.com/en-us/docs/setup-admin/team-services/sign-up-for-visual-studio-team-services

Then 

Thursday, August 4, 2016

TFS Version Control > First Steps

At my new company TFVC is used for version control on the source code. For me to be able to function in the development team I need to get a grip on TFVC.

The hierarchy of TFVC (from big to small):

  1. Collection
  2. Team Project
  3. Solution
  4. Project
A realization:
  • The Source Control Explorer 'explores' the Workspace.
  • The Workspace is the LOCAL copy of the Team Project - within a Collection - stored in TFS Version Control
  • The Workspace is generally identical to the files in the directory which was mapped to the Workspace BUT it is possible for there to be files in the file system which are not added to the workspace.
    • Add those 'missing' files via Visual Studio (Source Control Explorer, rightclick->Add to Source Control, etc.)
Facepalm moment:
When creating a new project (File > New > Project) the popup has four fields to fill in:
  1. Name
  2. Location
  3. Solution
  4. Solution name
Solution has a dropdown menu with three menu items:
  1. Create new solution
  2. Add to solution
  3. Create in new instance

START LEARNING MOMENT
When creating a new project WITHIN an existing solution then use the menu item 'Add to solution' as described above
END LEARNING MOMENT

Tuesday, August 2, 2016

New job, new priorities

So we changed jobs. Still Test Automation, but different colleagues (internal developers yay!) and different tooling. Out with the Tosca, in with the .NET technology.

So the new goal is: let's learn how to work with the Microsoft technology stack.

Very useful link:
https://www.visualstudio.com/en-us/docs/tfvc/overview

We start with creating a solution and then adding it to TFVC:
https://www.visualstudio.com/en-us/docs/tfvc/set-up-team-foundation-version-control-your-dev-machine

But then we run into the situation that it is apparently not possible to connect to MULTIPLE Team Foundation Servers from one Visual Studio client. This is rather annoying, since I was planning to do some experimenting on my own in the Visual Studio cloud, separate from work, but that is apparently not possible. My manager created a new 'Team Project'.

Current technical roadblocks to resolve:
1. Connect to the work TFS via VPN.

Tuesday, June 28, 2016

Tosca > TDM > Setting up

When working with Tosca, sooner or later the need to store test data in a database will arise. In Tosca this is supported via the 'TDM' (Test Data Management) functionality.

Before you can start writing testdata to a TDM database you'll need to create a 'class' in the TestCaseDesign section and assign attributes to that class corresponding to your TDM needs

  • DO NOT FORGET to tick the 'TDM' column tickbox for every attribute which needs to make it into your TDM database.



Find the relevant modules in the section:
Modules\Tricentis\Tbox\Tbox XEngines\TestDataManagement



First create an object with the 'TDM Create Object' module.
Then set attributes with the 'TDM Set Attribute' module.
Then save the object to the TDM database with the 'TDM Save' module.

To retrieve an entry from TDM use the 'TDM Find Single Object' module. After the call an object with the given name is stored in memory. To access the TDM values of this object use the 'TD' function in Tosca:
{TD[[ObjectName].[AttributeName]]}


Monday, June 6, 2016

Tosca > TDM TQL search within a search

In our repository we have a TDM structure where the different TDM objects are linked with relations. If you want to find one property via its link with another object, this is done via a TQL query:

->RETURN All:[TDMObject1]->[TDMObject2][Property=="[property_value]"]

or, in human-readable form:

-> RETURN All:Portfolio->Client[Username=="login01"]

Wednesday, May 18, 2016

Tosca > Report detail level

If the logs in your ExecutionLog don't contain enough details, then set:
Settings -> TBox -> Logging -> Report successful Execution of

to the value

TeststepValues - All

Tuesday, May 10, 2016

Tosca DUH! moment: May

Test Automation is - in the end - software development. Everyone who has ever dabbled in software development is familiar with being stuck on things that in the end turned out to be very trivial and make you feel like: DUH!. I'll be logging my Tosca DUH! moments in this post. It'll be good as therapy in any case, and seeing all these items listed under each other might provide me with some insights.


  • If you have a verification error, make sure you read the ENTIRE 'Loginfo' message and are not fooled by the fact that the textbox can be many times smaller than the actual message. /Facepalm
  • I was until now unaware of the distinction between single and double quotes in PowerShell.
    • In short: just use single quotes when possible
    • In long: http://windowsitpro.com/blog/single-quotes-vs-double-quotes-powershell-whats-difference

Wednesday, April 27, 2016

BitBucket > Setting up a Git repository

It is still my goal to have my own AWS-powered LAMP server (or Windows tech equivalent) up and running where I'll host a Git server, mail server, web server, etc. But what I wish for and what is realistic are generally two very different things, so it's time to get a bit more realistic in the short term. And that newfound realism is going to start with setting up a Git repository on BitBucket, so that we'll have a lightweight method to share coding projects across multiple locations.

How to:

  • Create an account at BitBucket - this should speak for itself
  • Create an EMPTY Git repository on your BitBucket account
  • Go to the empty repository at BitBucket and...
    • click 'Clone'
    • A popup will appear containing text starting with "git clone"
    • Copy all text MINUS "git clone" to the clipboard
  • Create a new project (optionally with initial code) in Visual Studio
  • Add the project to Source Control in Visual Studio (if necessary)
  • Commit the project
    • You'll get an error message that the project wasn't added to a remote repository and a screen with an option to provide the URL of the remote repo
    • Paste the clipboard into the URL field
    • Click the obvious button and...
  • DONE :-)

Definitely can be done neater, but this works.

Tuesday, April 19, 2016

Tosca > Steering Parameter > FireEvent > Finding out which one to use

If you are trying to steer a web element with Tosca and the default 'FireEvent = change' is not working, you have to find out which 'event' is being listened to. Here are two techniques for that:

Chrome:

  • Rightclick the element and choose 'Inspect'
  • In the DevTools panel that pops up, left-click the tag corresponding to the element
  • In the upper-right section of that panel there is a section with the tabs: Styles, Computed, and Event Listeners
    • Click the tab 'Event Listeners'
  • A list of events appears.
Firefox:
  • Rightclick the element and choose 'Inspect Element'
  • In the inspector panel that pops up, right-click the tag corresponding to the element and choose 'Show DOM Properties'
  • In the second panel that pops up there is a section on the right side.
    • Scroll down this section to find all 'on-' events.
  • Firefox seems less discerning than Chrome. I've had cases where Chrome found the events being listened to and Firefox didn't.

Tosca > Steering Parameter > FireEvent

A very useful item in your Tosca toolkit is the steering parameter 'FireEvent' (the green 'cubes' in the Properties tab of ModuleAttributes). By default FireEvent has the value 'change', but sometimes other 'events' are needed to have the website respond as desired. Recently I ran into a checkbox the checking of which changed an OK-button from disabled to enabled. The default 'change' value checked the checkbox without enabling the OK-button. All it took was to set FireEvent of the checkbox ModuleAttribute to the value 'click' and the OK-button was enabled.

In the case of multiple FireEvents separate them with a semi-colon. For example: 'change;blur'.

The possible values of FireEvent are:
  • onchange
  • onclick
  • ondblclick
  • onblur
  • onfocus
  • onmousedown
  • onmouseup
  • onmouseover
  • onmouseout
  • onsubmit
  • onreset
  • onpropertychange
Note: in Tosca the values given to the FireEvent steering parameter should be WITHOUT the 'on' prefix.

Thursday, April 14, 2016

Tosca > Ports required by the Tosca components

Several times during the building of our TA framework I've been temporarily blocked when having to find out which ports/traffic need to be allowed between Tosca components. Turns out there is a great Tosca page for it:

Search for: PORTS REQUIRED BY TOSCA COMPONENTS

https://support.tricentis.com/community/manuals_detail.do?sysparm_document_key=u_webhelp,425d827d379dd640406642f643990eea

Friday, April 1, 2016

Tosca DUH! moment: April

Test Automation is - in the end - software development. Everyone who has ever dabbled in software development is familiar with being stuck on things that in the end turned out to be very trivial and make you feel like: DUH!. I'll be logging my Tosca DUH! moments in this post. It'll be good as therapy in any case, and seeing all these items listed under each other might provide me with some insights.

  • Fed 'Buffername' instead of '{B[Buffername]}' to the parameter field of a ReusableTeststepBlock.
    • Circumstance: the buffername was used in a query, which - with the wrong data - produced zero results and this resulted in a confusing 'invalid column name' error.
    • Mitigation: not easy. There is no easy debug mode in Tosca to quickly see all PL or B values. Will have to be mindful of this.
  • If SteeringParameter FireEvent=change doesn't do the job check if other 'events' are being listened to and set the FireEvent parameter to them / add the FireEvent to them.
  • If you want {Click} to position the mouse pointer somewhere other than in the middle: write a customization where you override the 'ActionPointer' property.
  • typeA.IsAssignableFrom(typeB)
    • is TRUE if an instance of typeB can be assigned to a variable of typeA (easy to get backwards!)
  • Shorthand notation to define an IEnumerable object:
    • IEnumerable<Obj> collection = new[] {Obj1, Obj2, Obj3, etc.}
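The last two bullets as a runnable sketch; the direction of IsAssignableFrom is the part that is easy to get backwards:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// typeA.IsAssignableFrom(typeB) asks: can an instance of typeB be assigned
// to a variable of type typeA? (Note the direction.)
bool listIsEnumerable = typeof(IEnumerable<int>).IsAssignableFrom(typeof(List<int>));
bool enumerableIsList = typeof(List<int>).IsAssignableFrom(typeof(IEnumerable<int>));
Console.WriteLine(listIsEnumerable); // True
Console.WriteLine(enumerableIsList); // False

// new[] { ... } creates an implicitly typed array, which is itself an IEnumerable<T>.
IEnumerable<int> collection = new[] { 1, 2, 3 };
Console.WriteLine(collection.Sum()); // 6
```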

Wednesday, March 30, 2016

Tosca > {DATE} or {DATE[][][]} function

For some reason I ALWAYS forget the correct syntax of this function, so hereby a post to remember it forever:

First: a good search term in the Tosca documentation:
DATE AND TIME EXPRESSIONS

Syntax:
{<EXPRESSION>[<Basedate>][<Offset>][<Format>]}

In human-readable form:
{DATE[basedate][offset][format]}

Key letters for specifying the format:
  • d: days
  • M: months
  • y: years
  • h: hours
  • m: minutes
  • s: seconds

Tuesday, March 15, 2016

Visual Studio & Git & BitBucket: how to get started

I am not very knowledgeable when it comes to version control systems (something I really have to develop further), but luckily many tools work pretty well out of the box. A very simple way to get started with VCS is to use Visual Studio's built-in Git with BitBucket.

HOW TO:

  • Create a new empty repository in BitBucket
  • Create your to-be-version-controlled project in Visual Studio
  • Add the project to version control and choose 'Git'
  • Go to Team Explorer
    • Go to 'Changes'
    • Type 'initial commit' in the textbox
    • Click the 'Commit' button
      • Choose 'Commit and push'
    • VS will give an error message that no remote repository is linked to the project
      • In that same error message screen there will be a textbox where you can add the URL of the remote repo.
      • Copy-paste there the URL to your BitBucket repo and...
    • Done!

Thursday, March 3, 2016

Dabbling in TTT > Powershell script to assist with development/deployment

When working on SETs/Adapter customizations/Addins one of the very annoying activities is 'deploying' your .dll to Tosca after building it:


  • Find the directory where the newly built .dll was placed
  • Copy the newly built .dll to %TRICENTIS_HOME%\Automation\Framework
  • "Oh shit, forgot to 'End Process' the 'Tricentis.Automation.Agent.exe'"
  • End Process@Tricentis.Automation.Agent.exe
  • Re-find the directory where the newly built .dll was placed
  • Finally copy the newly built .dll to %TRICENTIS_HOME%\Automation\Framework
  • In the case of an Addin-deploy we even need to add killing the actual Tosca Commander client to this checklist.
Let's write a PowerShell script which takes care of all that. We'll have to write two scripts - one for SET/CustomAdapter deploys and one for Addin deploys - for the aforementioned reasons.

The SET/CA deploy script is easily done... [added:] actually it was not so easily done. The script posted earlier would often generate error messages because Copy-Item would be called before the old .dll was truly released. With the help of Stack Overflow we modified the script to be more robust:

# IsFileLocked was literally copied from Stack Overflow:
# renaming a file onto itself fails while another process still holds it.
function IsFileLocked([string]$filePath){
    Rename-Item $filePath $filePath -ErrorVariable errs -ErrorAction SilentlyContinue
    return ($errs.Count -ne 0)
}

$newFile = "[fullpath of new .dll]"
$targetDirectory = "C:\Program Files (x86)\TRICENTIS\TOSCA Testsuite\Automation\Framework"

# Stop the automation agent (if running) and wait for it to exit.
$agent = Get-Process -Name "Tricentis.Automation.Agent" -ErrorAction SilentlyContinue
if ($agent)
{
    Stop-Process -InputObject $agent
    Wait-Process -InputObject $agent
}

# Wait until the freshly built .dll is no longer locked, then copy it over.
Do
{
    Start-Sleep -m 500
} While (IsFileLocked $newFile)

Copy-Item $newFile $targetDirectory -Force

It bugs me that the above script has a hard 500ms wait in it, but I couldn't get the script to consistently work without it and won't waste time on half a second. The only caveat here is that you have to run Powershell in administrator mode. A quick Google showed it would be possible to make the script self-elevating but that's an exercise for another time. The above very simple script seemed to work fine and is a real time saver.

After updating the SETDeploy the AddinDeploy becomes trivial:
function IsFileLocked([string]$filePath){
    Rename-Item $filePath $filePath -ErrorVariable errs -ErrorAction SilentlyContinue
    return ($errs.Count -ne 0)
}

$newFile = "[fullpath new .dll]"
$targetDirectory = "C:\Program Files (x86)\TRICENTIS\TOSCA Testsuite\ToscaCommander\Addins"

# For Addins the Tosca Commander client itself has to go down instead.
$commander = Get-Process -Name "TOSCACommander" -ErrorAction SilentlyContinue
if ($commander)
{
    Stop-Process -InputObject $commander
    Wait-Process -InputObject $commander
}

Do
{
    Start-Sleep -m 500
} While (IsFileLocked $newFile)

Copy-Item $newFile $targetDirectory -Force

# Restart Tosca Commander after the deploy.
Start-Process "C:\Program Files (x86)\TRICENTIS\TOSCA Testsuite\ToscaCommander\TOSCACommander.exe"

Saturday, February 27, 2016

Tosca > Ideas to manage your artefacts

A fully developed TA framework based on Tosca can potentially have many artefacts, such as:
  • The Tosca version
  • The Tosca repository (One would almost forget!)
  • Tosca customizations in the form of .dll-s:
    • Special Execution Tasks (SETs)
    • Adapter Customizations
    • Add-ins
  • ODBC drivers for DB access
  • Tosca (.tcs) scripts to run Tosca via the commandline
  • PowerShell scripts which call the .tcs scripts
I am currently faced with the challenge of getting several scrum teams to start using a Tosca TA framework on a daily basis. This means that the framework userbase will increase from 2-4 TA project members who are intimately familiar with the product to 10-20 testers and devs who aren't. And, oh yeah, somewhere amidst all that a CI/CD pipeline will also be set up which will run Tosca scripts as part of its automated testing suite.

We clearly need:
  • A way to 'push' artefact updates to the (vastly increased) userbase
  • Version control on the artefacts
After a bit of research we have decided to go with the "Chocolatey, NuGet, Artifactory, OneGet" stack of technologies to try and implement this. I have next to no experience with these technologies so this should be interesting.

(Note: The technology of our company is predominantly Microsoft based (Windows desktops, Windows servers, C# .NET applications) which has been taken into account when choosing these tools.)

Update:
Ok, let's see if we can get a proof of concept going. We're going with:

  • A powershell script which calls an...
  • Executable which in turn calls
  • UniversalGreeter.dll which reads the target of its greeting from a
  • universalgreeter.config file which contains the word 'World'
Goal #1
  • Squeeze all that in one NuGet (using OneGet?)
  • Commit the NuGet to Artifactory
  • Pull the NuGet from Artifactory for a local install
  • I have as yet no idea where Chocolatey comes into play here
Goal #2
  • Same as Goal #1 but with added version history
  • Including being able to install older versions (a rollback scenario)
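As a first stab at Goal #1, the artefacts would be declared in a NuGet package manifest. A minimal .nuspec sketch, not a tested setup; the package id, file names, and target folder are illustrative only:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>UniversalGreeter</id>
    <version>1.0.0</version>
    <authors>ta-team</authors>
    <description>Greeter script, executable, dll and config packaged together.</description>
  </metadata>
  <files>
    <!-- Illustrative file names matching the proof-of-concept pieces above. -->
    <file src="Invoke-Greeter.ps1" target="tools" />
    <file src="Greeter.exe" target="tools" />
    <file src="UniversalGreeter.dll" target="tools" />
    <file src="universalgreeter.config" target="tools" />
  </files>
</package>
```

Packing this with `nuget pack` would produce the .nupkg that gets pushed to Artifactory.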

Saturday, January 2, 2016

AWS > First Steps


Task List
  1. Register yourdomain.com
  2. Set up an AWS Linux Server (server)
  3. Tell a/the Domain Name Server to associate yourdomain.com with the IP of the server
  4. Get a Git server running on the server
'Exploratory Testing' for the benefit of the tasks:
  • We will need to gather basic AWS knowledge before we can start to make real progress on the task list.
  • We start with AWS Getting Started:
    • http://docs.aws.amazon.com/gettingstarted/latest/swh/website-hosting-intro.html
  • Setting up to host a static website:
    • We need to sign up for AWS... check
    • Now we need to create an IAM (Identity and Access Management) user
    • AWS Management Console > IAM Management
      • Angry orange exclamation marks tell us that we need to increase the security of our AWS accounts. Let's look into that.
        • We are strongly advised to activate MFA (multi-factor authentication) on our AWS root account. We agree.
          • One angry orange exclamation mark has turned into a happy green box with a tick in it. One down one to go.
        • We set up a password policy for our IAM users and the last orange exclamation mark bites the dust. We are once more compliant. Happy days.
      • Next step: set up an IAM admin account so that we limit using the root account to a minimum... check
      • Now we can log on to the AWS Management Console via the link:
        • https://your_aws_account_id.signin.aws.amazon.com/console/
      • We customise the link so that it doesn't show the account ID and we can log on via the link
        • https://your_account_alias.signin.aws.amazon.com/console/
      • We add this link as a note in our password manager
      • We can also login directly via the password manager(!)
    • RESULT: 
      • We can now use the IAM admin account instead of the AWS root account for work on AWS.
    • God damn it, now my password manager is having problems distinguishing between the regular Amazon login and the AWS root account login. Gotta fix this.
      • Ok, fixed. We continue on.
    • Next we create the AWS buckets and set their settings per the tutorial's instruction:
      • yourdomain.com
      • www.yourdomain.com
      • logs.yourdomain.com
    • LEARNING MOMENT:
      • The tutorial is very clear on the fact that this type of S3-bucket-based hosting is for static websites only. For dynamic content (or something as a Git server) we will need to set up a virtual server.
    • Painful: we just wasted 30 minutes trying to find the 'upload' action for one of the buckets. Turns out that the general S3 module starting page shows general actions only (who would've thought?), and that to get the bucket-specific actions you first need to click a bucket. Ouch.
    • Interesting: we can create a folder structure (with ditto content) inside the buckets to match file references inside the html document.
    • Continuing on with the 'Getting Started with AWS - Hosting a static website' guide we:
    • Configure the buckets
    • Deploy the website
    • Register the domain name via the AWS Route 53 module
    • Associate the domain name with the website
      • We create a 'hosted zone' (sounds fancy) for the domain
        • IMPORTANT:
          • By creating a hosted zone a number (4) of Name Servers were generated. We will need this list of name servers in the 'registered domain' section to link the registered domain to the hosted zone.
      • We create 'record sets' for the root domain and the subdomain
      • We setup a DNS provider
        • This actually took me quite a while to understand
        • We need to "log into the domain name registrar used to register your domain"
          • This IS the Route 53 module of AWS
        • Then we need to 
          • Use the web interface provided by the registrar to set the name servers for your domain to the name server values displayed under Name Servers in the details for the hosted zone.
        • This is done by:
          • going to the 'Registered Domains' section and clicking on yourdomain.com
          • click Add/Edit Name Servers
            • enter the name servers that were created for the hosted zone in the window which pops up
      • It might take a while for these changes to propagate through the internet but essentially this is when you're done.
    • RECAP MOMENT:
      • We create buckets (in AWS S3) which correspond with a root domain and its subdomains respectively and contain the actual content of the website
      • We register domains (in AWS Route 53)
      • We create a hosted zone (in AWS Route 53) which acts as an intermediary between the buckets and the registered domains
    • Also: Task 1... COMPLETED! :-)