Amazon acquires a new Disaster Recovery Solution from CloudEndure.

Amazon has recently purchased a new disaster recovery solution, and the service is now available in the AWS Marketplace.

It is reported that AWS purchased CloudEndure for approximately $250 million.

CloudEndure focuses on uptime and is capable of replicating applications across different cloud providers or data centres in case of a disaster (outage) at the primary data centre or cloud provider.

This is already supported on a number of cloud platforms. Customers can also move their IT infrastructure to a different cloud provider through its migration tools.

Sources:
https://www.geekwire.com/2019/reports-amazon-adds-disaster-recovery-tech-aws-purchase-cloudendure/
https://www.timesofisrael.com/israels-cloudendure-confirms-acquisition-by-amazon-cloud-services-unit/

How to Configure AWS CLI

Note: The AWS CLI must be configured on your local workstation. It can be set up simply by downloading it from AWS.

There are different ways to install it, but my preferred method is to use the installer, which can be downloaded from this site: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html

Use-Case: Install AWS CLI on Windows
Note: You can also install it on macOS and Linux. See the URL above for the steps.

The installer for Windows can be downloaded here: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html

After installation, please verify that it has been installed correctly by using the command below.

aws --version

Follow the steps below to configure the AWS CLI:

$ aws configure
AWS Access Key ID [None]: Your Access Key ID
AWS Secret Access Key [None]: Your Secret Key
Default region name [None]: Enter your desired region in this format (us-west-2)
Default output format [None]: json (I usually leave this as json)

When you type this command, the AWS CLI prompts you for four pieces of information and stores them in a profile (a collection of settings) named default. This profile is then used any time you run an AWS CLI command that doesn’t explicitly specify a profile to use.
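
If you work with more than one account, you can also create an additional named profile and select it per command with the --profile option. A minimal example is shown below; the profile name dev is just an illustration:

aws configure --profile dev
AWS Access Key ID [None]: Your Access Key ID
AWS Secret Access Key [None]: Your Secret Key
Default region name [None]: eu-west-1
Default output format [None]: json

aws s3 ls --profile dev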

https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html

How to Copy from a Server to S3 and Vice Versa

S3, which is known as Simple Storage Service by AWS, is an object storage service that offers great scalability, data availability, security, and performance.

Prerequisite:

Note: The AWS CLI must be configured on your local workstation. It can be set up simply by downloading it from AWS, as covered in the section above. First install the AWS CLI for your operating system (Windows, macOS or Linux).

Then follow the steps below to configure the AWS CLI:

$ aws configure
AWS Access Key ID [None]: Your Access Key ID
AWS Secret Access Key [None]: Your Secret Key
Default region name [None]: Enter your desired region in this format (us-west-2)
Default output format [None]: json (I usually leave this as json)

After configuring the AWS CLI, the next step is to use the right syntax to copy files to and from S3.

Note: You will need to have the right permissions in place to perform these tasks.
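
As a quick sanity check (assuming your IAM user or role already allows listing the bucket), you can confirm that your credentials can reach the bucket before copying anything:

aws s3 ls s3://s3bucketname/s3foldername/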

Here are the use cases:

Copy from S3 to host: Here we have a bucket sitting at the root level, with a folder in it and a file inside the folder.
You then need to specify the path on your server that you want this to be copied to.

aws s3 cp s3://s3bucketname/s3foldername/fileexample.7z G:\fileexample.7z

Copy from host to S3: This simply takes the file on drive C and uploads it to S3. Here we have the bucket, the folder inside the bucket, and the name the file will have inside the bucket, followed by the region you want the file to be uploaded to.

aws s3 cp C:\filename.zip s3://s3bucketname/s3foldername/fileexample.zip --region ap-northeast-1
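
If you need to move more than a single file, the same command also accepts the --recursive flag. A short sketch, using placeholder folder paths and the bucket names from above:

aws s3 cp C:\examplefolder s3://s3bucketname/s3foldername/ --recursive
aws s3 cp s3://s3bucketname/s3foldername/ G:\examplefolder --recursive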

Chef – Node Bootstrapping

The chef-client agent runs Chef recipes on both physical and virtual servers, often referred to as nodes or instances. Associating nodes with the Chef server ensures that the chef-client software is installed on the nodes.

Note: The minimum supported version of chef-client on nodes associated with an OpsWorks for Chef Automate server is 12.16.42. We recommend running chef-client 13.6.4.

Using knife to add or bootstrap an instance (See https://docs.chef.io/platforms.html for supported operating systems)

Example 1: Bootstrapping an instance using knife commands. 

knife bootstrap windows winrm EnterNodeIPAddress --winrm-user Administrator --winrm-password 'enterserverpasswordhere' --node-name EnterNodeNamehere --run-list 'recipe[chef-client]'

Note: You can attach any cookbooks in place of the chef-client run-list, or upload a role to the Chef server (which in turn has many run-list items attached to it) and update the node's run-list to use the newly created role. This topic will be discussed at a later stage.
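
For example, once a role has been uploaded to the Chef server, the node's run-list could be switched over with a knife command along these lines (role[base] is just a placeholder role name):

knife node run_list set EnterNodeNamehere 'role[base]'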

Also see the following if you wish to add nodes automatically: https://docs.aws.amazon.com/opsworks/latest/userguide/opscm-unattend-assoc.html

Chef-Repo Setup

As discussed in https://techdirectarchive.com/2018/12/14/get-started-with-opsworks-for-chef-automate/, it is recommended to place your cookbooks and other artifacts in Git or any other version control system (VCS).

A Chef repository contains several directories. Each directory in the Starter Kit contains a README file that describes the directory’s purpose and how to use it for managing your systems with Chef. There are two ways to get cookbooks installed on your Chef server, either by using
– knife commands, or
– Berkshelf commands.
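
For instance, a cookbook that already sits in your local cookbooks/ directory could be pushed to the Chef server with the knife command below (the cookbook name is only a placeholder); the Berkshelf approach is covered in Use-Case 1 further down:

knife cookbook upload examplecookbook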

Create a directory on your local PC for storing cookbooks, similar to the chef-repo.

  • Create a directory on your local computer for storing cookbooks, such as chef-repo. After you add cookbooks, roles, and other files to this repository, we recommend that you upload or store it in a secure, versioned system, such as AWS CodeCommit, Git, or Amazon S3.
  • In the chef-repo directory, create the following three directories, as shown in the Starter Kit (example commands follow this list):
    1. cookbooks/ – Stores cookbooks that you download or create.
    2. roles/ – Stores roles in .rb or .json formats.
    3. environments/ – Stores environments in .rb or .json formats.
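
On Windows, this layout could be created from PowerShell roughly as follows (chef-repo is just the example directory name used above):

mkdir chef-repo
cd chef-repo
mkdir cookbooks
mkdir roles
mkdir environments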

Use-Case 1
Use Berkshelf to get Cookbooks from the Chef Supermarket:

Berkshelf is a tool for managing cookbooks and their dependencies. It downloads a specified cookbook into your local storage, which is called the Berkshelf. You can specify which cookbooks and versions to use with your Chef server and upload them.

First: Let’s try this very basic example.
Step 1: Navigate to the Chef Supermarket at https://supermarket.chef.io/ and search for Google Chrome and Firefox.

Then locate the Berksfile and add the Firefox and Google Chrome cookbooks as shown below. This ensures the cookbooks are downloaded and stored locally.

Locate the path where you downloaded the Starter Kit to; mine is

C:\Users\yourname\ChefAutomate\chef-automateserver

Let’s modify the Berksfile and add the cookbooks for Firefox and Chrome from the Chef Supermarket.
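
A minimal Berksfile for this step might look something like the snippet below; treat the cookbook entries as placeholders and take the exact names (and any version constraints) from their Chef Supermarket pages:

source 'https://supermarket.chef.io'

cookbook 'firefox'
cookbook 'chrome'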

Step 2: Download and install the cookbooks on your local computer
– Launch the Chef DK client (PowerShell)

Navigate to the local Chef repository from the Chef DK terminal; otherwise this will not work if you are in a different directory. See the image below and run the command. See the screenshot below for the repo content.

Step 3: Upload this cookbook to Chef Server

This works differently on Linux than on Windows.
See link: https://docs.aws.amazon.com/opsworks/latest/userguide/opscm-starterkit.html

For Windows: Ensure the execution policy in PowerShell is set to RemoteSigned. The execution policy determines whether you can load configuration files and which scripts, if any, must be digitally signed. See the image below for how to set the execution policy to RemoteSigned.

See https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.security/set-executionpolicy?view=powershell-6
This simply restricts us from running unsigned PowerShell scripts downloaded from the Internet.
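
One way to do this from an elevated PowerShell session, and then confirm the change, is:

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine
Get-ExecutionPolicy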

Run berks install

Otherwise, you wouldn’t be able to upload your cookbooks. Verify that Berkshelf is installed by running the command below in the console:

berks --version

Next, run the command below to add the SSL cert file.

$env:SSL_CERT_FILE="ca_certs\opsworks-cm-ca-2016-root.pem"

Then run berks upload to upload the cookbooks to the Chef server:

berks upload

Finally remove the SSL Cert File

Remove-Item Env:SSL_CERT_FILE

To verify that this worked, simply run the command “knife cookbook list” as shown below

https://docs.aws.amazon.com/opsworks/latest/userguide/welcome_opscm.html

Get Started with OpsWorks for Chef Automate

Provision a Chef Automate server in AWS. Follow this link https://aws.amazon.com/opsworks/chefautomate/ and click on ‘Get Started with OpsWorks for Chef Automate’.

When setup is complete, ensure you download the sign-in credentials for the Chef server, as these are not saved by AWS OpsWorks for Chef Automate and cannot be downloaded or viewed afterwards.

Starter Kit: Download and unzip the Starter Kit. The Starter Kit contains a README file with examples, a knife.rb configuration file, and a private key for the primary, or pivotal, user.

Note: A new key pair is generated—and the old key is reset—each time you download the Starter Kit.

The Starter Kit .zip file includes a simple example of a Chef repository that works with any AWS OpsWorks for Chef Automate server. In the Chef repository, you store cookbooks, roles, configuration files, and other artifacts for managing your nodes with Chef. It is recommended that you store this repository in a version control system, such as Git or AWS CodeCommit, and treat it as source code.

Note: Never share the Starter Kit private key. If other users will be managing the Chef server, add them as administrators in the Chef Automate dashboard.

Chef Development Kit: Download and run it on the computer that will administer the Chef server and nodes. The knife utility is also part of the Chef DK.
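
Assuming you run knife from the unzipped Starter Kit directory (so that its knife.rb configuration and private key are picked up), a quick way to confirm that the Chef DK can talk to your Chef Automate server is:

knife node list
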
https://docs.aws.amazon.com/opsworks/latest/userguide/welcome_opscm.html