AWS CodeDeploy, AWS CodeBuild, and AWS CodePipeline are scalable services offered by AWS that automate an application's build and deployment pipeline. To deliver speed and agility, organizations are moving toward automating their entire application pipelines. This book covers all the AWS services required to automate deployment to your instances.
You'll begin by setting up and using one of the AWS services for automation – CodeCommit. Next, you'll learn how to build sample Maven and NodeJS applications using CodeBuild. After you've built the applications, you'll see how to use CodeDeploy to deploy them to EC2/Auto Scaling. You'll also build a highly scalable and fault-tolerant Continuous Integration (CI)/Continuous Deployment (CD) pipeline using some easy-to-follow recipes.
Following this, you'll achieve CI/CD for a microservice application and reduce the risk within your software development life cycle globally. You'll also learn to set up an infrastructure using CloudFormation templates and Ansible, and see how to automate AWS resources using AWS Lambda.
Finally, you'll learn to automate instances in AWS and automate the deployment lifecycle of applications. By the end of this book, you'll be able to minimize application downtime and implement CI/CD, gaining total control over your software development lifecycle.
You can read this e-book in Legimi apps or any other app that supports the following format:
Page count: 274
Year of publication: 2017
BIRMINGHAM - MUMBAI
Copyright © 2017 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing, and its dealers and distributors will be held liable for any damages caused or alleged to be caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
First published: November 2017
Production reference: 1221117
ISBN 978-1-78839-492-5
www.packtpub.com
Author
Nikit Swaraj
Copy Editor
Charlotte Carneiro
Reviewer
Gajanan Chandgadkar
Project Coordinator
Virginia Dias
Commissioning Editor
Vijin Boricha
Proofreader
Safis Editing
Acquisition Editor
Meeta Rajani
Indexer
Aishwarya Gangawane
Content Development Editor
Sharon Raj
Graphics
Kirk D'Penha
Technical Editor
Mohit Hassija
Production Coordinator
Shantanu Zagade
Nikit Swaraj is an experienced professional DevOps/Solutions Architect. He understands the melding of development and operations to deliver efficient code. He has expertise in designing, developing, and delivering enterprise-wide solutions that meet business requirements and enhance operational efficiency. As an AWS solutions architect, he has vast experience in designing end-to-end IT solutions and leading and managing complete life cycle projects within optimal time and budget. He also contributes to Kubernetes (open source).
He has been associated with enterprises such as Red Hat and APNs (AWS Partner Network). He is a certified Red Hat/OpenStack architect, as well as being an AWS solutions architect. He also writes blogs on CI/CD with AWS Developer Tools, Docker, Kubernetes, Serverless, and much more.
I would like to express my gratitude to Packt Publishing who have given me the tremendous opportunity to write this book. I would like to thank Meeta Rajani (senior acquisition editor, Packt), who encouraged and enabled me to write this book.
I would like to show my appreciation and thanks to Sharon Raj, Mohit Hassija, and the team at Packt, who saw me through this book, provided support, talked things over, read, offered comments, reviewed, allowed me to quote their remarks, and assisted in the editing, proofreading, and design. This book would not have been completed without your help.
I would also like to thank some of my mentors, with whom I learned about and implemented AWS and DevOps. The first is Tarun Prakash (senior DevOps engineer, MediaIQ Digital), who helped and guided me when I entered the world of DevOps and AWS. I would like to thank Rahul Natarajan (lead cloud architect and consultant, Accenture), who gave me guidance and enabled me to use the Developer Tools and related services of AWS, as well as DevOps tools and technologies. I would also like to thank Santosh Panicker (senior technical account manager, Amazon Web Services), under whom I worked and learned a lot about infrastructure and client requirements. He made me understand which services can be used in the best way. My certifications are a different story, but this person molded me into an actual solutions architect.
I would also like to thank my family and best friends Vijeta and Tanushree for encouraging me and providing me with the emotional support to complete this book.
Gajanan Chandgadkar has more than 12 years of IT experience. He has spent more than 6 years in the USA, helping large enterprises architect, migrate, and deploy applications in AWS. He has been running production workloads on AWS for over 6 years. He is an AWS certified solutions architect professional and a certified DevOps professional with more than 7 certifications in trending technologies. Gajanan is also a technology enthusiast with extended interest and experience in different topics, such as application development, container technology, and Continuous Delivery.
Currently, he is working with Happiest Minds Technologies as Associate DevOps Architect and has worked with Wipro Technologies Corporation in the past.
For support files and downloads related to your book, please visit www.PacktPub.com. Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.PacktPub.com and, as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.
At www.PacktPub.com, you can also read a collection of free technical articles, sign up for a range of free newsletters and receive exclusive discounts and offers on Packt books and eBooks.
https://www.packtpub.com/mapt
Get the most in-demand software skills with Mapt. Mapt gives you full access to all Packt books and video courses, as well as industry-leading tools to help you plan your personal development and advance your career.
Fully searchable across every book published by Packt
Copy and paste, print, and bookmark content
On demand and accessible via a web browser
Thanks for purchasing this Packt book. At Packt, quality is at the heart of our editorial process. To help us improve, please leave us an honest review on this book's Amazon page at https://www.amazon.com/dp/1788394925.
If you'd like to join our team of regular reviewers, you can email us at [email protected]. We award our regular reviewers with free eBooks and videos in exchange for their valuable feedback. Help us be relentless in improving our products!
Title Page
Copyright
AWS Automation Cookbook
Credits
About the Author
Acknowledgments
About the Reviewer
www.PacktPub.com
Why subscribe?
Customer Feedback
Preface
What this book covers
What you need for this book
Who this book is for
Sections
Getting ready
How to do it...
How it works...
There's more...
See also
Conventions
Reader feedback
Customer support
Errata
Piracy
Questions
Using AWS CodeCommit
Introduction
Introducing VCS and Git
What is VCS?
Why VCS?
Types of VCS
What is Git?
Why Git over other VCSs?
Features of Git
How to do it...
Installation of Git and its implementation using GitHub
Introducing AWS CodeCommit - Amazon managed SaaS Git
How to do it...
Getting started with CodeCommit for HTTP users
How to do it...
Setting up CodeCommit for SSH users using AWS CLI
Getting ready
How to do it...
Applying security and restrictions
Getting ready
How to do it...
Migrating a Git repository to AWS CodeCommit
How to do it...
Building an Application using CodeBuild
Introduction
Introducing AWS CodeBuild
How to do it...
How it works...
Pricing
Building a Java application using Maven
Getting ready
Install Java and verify
Install Apache Maven and verify
How to do it...
Building a NodeJS application using yarn
Getting ready
Install NodeJS and verify
Install Yarn and verify
How to do it...
Installing dependencies
How it works...
Building a Maven application using AWS CodeBuild console
Getting ready
How it works...
Building a sample NodeJS application using AWS CodeBuild via Buildspec.yml
Buildspec.yml
Syntax
Getting ready
How to do it...
Deploying Application using CodeDeploy & CodePipeline
Introduction
The Deployment strategy in AWS CodeDeploy
In-place deployment
Blue-green deployment
How to do it...
Writing an application-specific file
How to do it...
Deploying a static application in an EC2 instance from the S3 Bucket using AWS CodeDeploy
Getting ready
How to do it...
How it works...
Introducing AWS CodePipeline and its working
How to do it...
How it works...
Continuous Deployment of static application to AWS S3 using AWS CodePipeline
How to do it...
Building Scalable and Fault-Tolerant CI/CD Pipeline
Introduction
Benefits of using the CI/CD pipeline
How to achieve the benefits?
The scenario
The challenges
CI/CD pipeline workflow
Getting ready
How to do it...
Setting up AWS CodeCommit
Getting ready
How to do it...
Creating the S3 bucket and enabling versioning
Getting ready
How to do it...
Creating the launch configuration and Auto Scaling group
Getting ready
How to do it...
Creating AWS CodeDeploy application using the Auto Scaling group
Getting ready
How to do it...
Setting up the Jenkins Server and installing the required plugins
Getting ready
How to do it...
Integrating Jenkins with all of the AWS developers tools
Getting ready
How to do it...
Understanding Microservices and ECS
Introduction
Understanding microservices and their deployment
Designing microservices
Deployment of microservices
Playing around with Docker containers
Containers
Docker
Images
Registry
Containers
Getting ready
Installation of Docker engine
Run Docker as a non-root user
How to do it...
Running a container
Starting the stopped container
Assigning a Name to a container
Creating daemonized containers
Exposing ports of a container
Managing persistent storage with Docker
Adding a data volume
Getting details of a container
Containerize your application using Dockerfile
Push the image to Dockerhub
Setting up AWS ECR and pushing an image into it
Getting ready
How to do it...
To authenticate Docker client with ECR
Tagging your Docker Image with the repository details
Pushing the image to ECR
Understanding ECS and writing task definitions and services
Getting ready
How to do it...
Verifying containers inside the Container instance
Continuous Deployment to ECS Using Developer Tools and CloudFormation
Introduction
Understanding the architecture and workflow
How to do it...
How it works...
Setting up the infrastructure to host the application
Getting ready
How to do it...
Creating an ECS cluster
Creating a Load Balancer (Classic ELB)
Register Auto Scaling with Load Balancer
Creating an Amazon ECR
Setting Up CodeCommit for our application source
Getting ready
How to do it...
Creating a CodeBuild project for the build stage
Getting ready
How to do it...
Understanding the inside content of helper files (BuildSpec.yml, Dockerfile, and CF template)
How to do it...
Creating a CodePipeline using CodeCommit, CodeBuild, and CloudFormation
Getting ready
How to do it...
IaC Using CloudFormation and Ansible
Introduction
AWS CloudFormation and writing the CloudFormation template
Terms and concepts related to AWS CloudFormation
For YAML
For JSON
How to do it...
Writing a CF template
Defining parameters
Using parameters
Creating stack using the CF template
Creating a production-ready web application infrastructure using CloudFormation
Getting ready
How to do it...
Automation with Ansible
Workflow
Installation
How to do it...
File structure and syntax
Deploying a web server using Ansible
Creating an AWS infrastructure using the Ansible EC2 dynamic inventory
Getting ready
How to do it...
Automating AWS Resource Control Using AWS Lambda
Introduction
Creating an AMIs of the EC2 instance using AWS lambda and CloudWatch
Getting ready
How to do it...
Sending notifications through SNS using Config and Lambda
Getting ready
How to do it...
Configuring the AWS Config service for AWS resources
Creating a Lambda function
Creating a trigger
Streaming and visualizing AWS CloudTrail logs in real time using Lambda with Kibana
Workflow
Getting ready
How to do it...
Enabling CloudTrail logs
Configuring CloudWatch
Creating Elasticsearch
Enabling the streaming of CloudWatch logs in Elasticsearch
Configuring Kibana to visualize your data
Microservice Applications in Kubernetes Using Jenkins Pipeline 2.0
Introduction
K8s architecture
Master components
Node components
Deploying multinode clusters on AWS using the Ansible playbook
Getting ready
How to do it...
Deploying a multinode production-ready cluster on AWS using Kops
Getting ready
How to do it...
Creating bucket
DNS configuration
Creating a cluster
Kubernetes dashboard (UI)
Clean up
Deploying a sample application on Kubernetes
Getting ready
How to do it...
Configuration file
Deployment configuration file
Service configuration file
Working with Kubernetes on AWS using AWS resources
Getting ready
How to do it...
Creating a persistent volume claim
Deployment configuration file (includes ECR image and PVC)
Service configuration file (type Loadbalancer)
Jenkins pipeline 2.0 (Pipeline as Code) using Jenkinsfile
How to do it...
Declarative pipeline
Sections
Application deployment using Jenkinsfile
Getting ready
How to do it...
Create a pipeline in the BlueOcean
Clean Up
Creating a Pipeline using existing Jenkinsfile
Deploying microservices applications in Kubernetes using Jenkinsfile
Getting ready
How to do it...
Workflow
Best Practices and Troubleshooting Tips
Best practices with AWS CodeCommit
Troubleshooting with CodeCommit
Troubleshooting with CodeBuild
AWS provides a set of powerful services that help companies establish rapid and reliable build processes to deliver products using AWS and DevOps practices. These services help to simplify the provisioning and management of infrastructure, the building of applications, and the deployment of application code to an environment. DevOps is basically a combination of practices, culture, and tools that increases an organization's productivity. It improves the ability to deliver applications and services efficiently, which helps organizations serve their customers better and compete more effectively in the market.
You can leverage AWS services for DevOps, meaning you can use AWS services to increase an organization's productivity by automating CI/CD to deliver products quickly. The Developer Tools of AWS include CodeCommit, which uses Git for VCS; CodeBuild, which helps to build the code; CodeDeploy, which helps to deploy application code to servers; and CodePipeline, which helps to integrate all of the previous services to create an automated pipeline. So, this book covers how to use the AWS Developer Tools and how to integrate them with each other. Further, this book covers enterprise-grade scenarios and creates CI/CD pipelines for application deployment. Since this book covers the details of how to use the core services, you can also create your CI/CD pipeline based on your use cases.
This book also covers how to set up production-ready infrastructures using CloudFormation and Ansible. Since many enterprises are migrating their applications to microservices, and the best enterprise-grade container orchestration tool is Kubernetes, I will cover how you can deploy Kubernetes on AWS using Kops, and how you can automate application deployment in Kubernetes using Jenkins Pipeline, which is Pipeline as Code. This book also covers the automation of daily jobs and security compliance using AWS Lambda and some other AWS services, such as SNS, Config, and Elasticsearch.
Chapter 1, Using AWS CodeCommit, covers the basic concepts of VCS. Here, you will learn how to create a repository in GitHub and upload local files to the remote repository. Then, you will learn CodeCommit in detail and also play with some operations, such as cloning using SSH or HTTPS and migrating from GitHub to CodeCommit.
Chapter 2, Building an Application Using AWS CodeBuild, introduces how to build two different applications developed in Java and NodeJS using CodeBuild. This chapter will also show you how you can use a build specification file.
Chapter 3, Deploying an Application Using AWS CodeDeploy and AWS CodePipeline, covers the basics of the deployment strategies used by CodeDeploy. After that, you will learn how to write an application specification file that helps CodeDeploy deploy an application to the servers. You will also learn how CodePipeline is used to integrate the Developer Tools.
Chapter 4, Building a Highly Scalable and Fault-Tolerant CI/CD Pipeline, walks through recipes to create a highly scalable and fault-tolerant pipeline. The recipes include setting up CodeCommit, S3 buckets, Auto Scaling, CodeDeploy projects, and more.
Chapter 5, Understanding Microservices and AWS ECS, covers microservices and their deployment. You will also learn to play around with Docker containers. Then, you will learn about ECS and its components, and how to deploy a containerized application in ECS.
Chapter 6, Continuous Deployment to AWS ECS Using CodeCommit, CodeBuild, CloudFormation, and CodePipeline, contains recipes to build a pipeline for the continuous deployment of a containerized application to AWS ECS using other AWS services.
Chapter 7, IaC Using CloudFormation and Ansible, covers the syntax and structure that help you write a CloudFormation template to spin up AWS resources. It also includes a CloudFormation template that helps with setting up production-ready infrastructures. The same is then covered for Ansible.
Chapter 8, Automating AWS Resource Control Using AWS Lambda, contains recipes that are related to audit compliance and automation with AWS resources, such as creating an AMI of the EC2 instance using AWS Lambda and CloudWatch, sending notifications through SNS using Config and Lambda, and streaming and visualizing AWS CloudTrail logs in real time using Lambda with Kibana.
Chapter 9, Deploying Microservice Applications in Kubernetes Using Jenkins Pipeline 2.0, contains recipes covering the deployment of Kubernetes on AWS using Kops and custom Ansible playbooks. You will also learn to use a Jenkinsfile and, using it, deploy a containerized application in Kubernetes.
Chapter 10, Best Practices and Troubleshooting Tips, includes some best practices with CodeCommit and CodeBuild and also covers troubleshooting tips.
The following are the basic requirements to get the most out of this book:
A Linux system (preferably CentOS/Red Hat) with a browser and a good editor
An AWS account
This book targets developers and system administrators who are responsible for hosting an application and managing instances in AWS. DevOps engineers looking at providing continuous integration and deployment and delivery will also find this book useful. A basic understanding of AWS, Jenkins, and some scripting knowledge will be needed.
In this book, you will find several headings that appear frequently (Getting ready, How to do it…, How it works…, There's more…, and See also). To give clear instructions on how to complete a recipe, we use these sections as follows:
This section tells you what to expect in the recipe, and describes how to set up any software or any preliminary settings required for the recipe.
This section contains the steps required to follow the recipe.
This section usually consists of a detailed explanation of what happened in the previous section.
This section consists of additional information about the recipe in order to make the reader more knowledgeable about the recipe.
This section provides helpful links to other useful information for the recipe.
In this book, you will find a number of text styles that distinguish between different kinds of information. Here are some examples of these styles and an explanation of their meaning. Code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles are shown as follows: "The reason for this is we have given access to only two operations or actions: git push and git clone."
A block of code is set as follows:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "codecommit:GitPull",
        "codecommit:GitPush"
      ],
      "Resource": "arn:aws:codecommit:us-east-1:x60xxxxxxx39:HelloWorld"
    }
  ]
}
Any command-line input or output is written as follows:
# git config --global user.name "awsstar"
New terms and important words are shown in bold. Words that you see on the screen, for example, in menus or dialog boxes, appear in the text like this: "Click on Create Policy; then we will have our own custom policy."
Feedback from our readers is always welcome. Let us know what you think about this book: what you liked or disliked. Reader feedback is important for us as it helps us develop titles that you will really get the most out of. To send us general feedback, simply email [email protected], and mention the book's title in the subject of your message. If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, see our author guide at www.packtpub.com/authors.
Now that you are the proud owner of a Packt book, we have a number of things to help you to get the most from your purchase.
Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you find a mistake in one of our books (maybe a mistake in the text or the code), we would be grateful if you could report this to us. By doing so, you can save other readers from frustration and help us improve subsequent versions of this book. If you find any errata, please report them by visiting http://www.packtpub.com/submit-errata, selecting your book, clicking on the Errata Submission Form link, and entering the details of your errata. Once your errata are verified, your submission will be accepted and the errata will be uploaded to our website or added to any list of existing errata under the Errata section of that title. To view the previously submitted errata, go to https://www.packtpub.com/books/content/support and enter the name of the book in the search field. The required information will appear under the Errata section.
Piracy of copyrighted material on the internet is an ongoing problem across all media. At Packt, we take the protection of our copyright and licenses very seriously. If you come across any illegal copies of our works in any form on the internet, please provide us with the location address or website name immediately so that we can pursue a remedy. Please contact us at [email protected] with a link to the suspected pirated material. We appreciate your help in protecting our authors and our ability to bring you valuable content.
If you have a problem with any aspect of this book, you can contact us at [email protected], and we will do our best to address the problem.
The following recipes will be covered in this chapter:
Introducing VCS and Git
Introducing AWS CodeCommit - Amazon managed SaaS Git
Getting started with CodeCommit for HTTP users
Setting up CodeCommit for SSH users using AWS CLI
Applying security and restrictions
Migrating a Git repository to AWS CodeCommit
In this chapter, we will be working with Git and will mostly play around with AWS CodeCommit. We will set up a repository in AWS CodeCommit using the console, as well as CLI, and enforce a security policy on top of it. We will also migrate the basic Git-based repository to AWS CodeCommit, and will cover some best practices and troubleshooting while dealing with issues on AWS CodeCommit.
A VCS is a software development tool that helps a software team manage changes to source code over time. A VCS keeps track of each and every modification to the code in a database. If a mistake is made, a developer can compare earlier versions of the code and fix the mistake while minimizing disruption to the rest of the team.
The most widely used VCS in the world is Git. It's a mature and actively maintained open source project developed by Linus Torvalds in 2005.
A version control system (VCS) is a system that records changes to a file (or a set of files) so that we can recall specific versions whenever we want. In this book, we mostly play around with the source code of software or applications, but that does not mean we can track version changes only in source code. If you are a graphic designer or work in infrastructure automation and want to keep every version of an image layout or configuration file, then a VCS is the best thing to use.
There are lots of benefits to using VCS for a project. A few of them are mentioned here:
Collaboration: Anyone or everyone in the team can work on any file of the project at any time. There is no question about where the latest version of a file or of the whole project is: it is in a common, central place, your version control system.
Storing versions properly: Saving a version of a file or an entire project after making changes is an essential habit, but without a VCS, it becomes tough, tedious, and error-prone. With a VCS, we can save the entire project and name its versions. We can also describe the project, and what has changed in the current version compared to the previous one, in a README file.
Restoring previous versions: If you mess up your present code, you can simply undo the changes in a few minutes.
There are many more features of using VCS while implementing or developing a project.
The types of VCS are mentioned as follows:
Local version control system: In a local VCS, all the changes to a file are kept on the local machine, which has a database of all the changes to a file under revision control, for example, the Revision Control System (RCS).
Centralized version control system: In a centralized VCS, we can collaborate with other developers on different machines. These systems need a single server that contains all the versioned files, and a number of clients check out files from that single server, for example, Subversion (SVN).
Distributed version control system: In a distributed VCS, the client not only checks out the latest version of the file but also mirrors the whole repository. Thus, if the server these systems were collaborating via dies, any of the client repositories can be copied back to the server to restore it. An example of this is Git.
Git is a distributed VCS that came into the picture when new tooling was needed to maintain the Linux kernel. The Linux kernel development community was using a proprietary distributed version control system (DVCS) called BitKeeper. After some time, the relationship between the Linux community developers and the proprietary BitKeeper broke down, which led the community (in particular, Linux creator Linus Torvalds) to develop their own DVCS tool, called Git. They took a radical approach that makes it different from other VCSs, such as CVS and SVN.
It wouldn't be appropriate to say that Git is better than SVN or any other VCS; it depends on the scenario and the requirements of the project. However, nowadays, most enterprises have chosen Git as their VCS for the following reasons:
Distributed nature: Git has been designed as a distributed VCS, which means every user can have a complete copy of the repository data stored locally, so they can access the file history extremely fast. It also allows full functionality when the user is not connected to the network, whereas in a centralized VCS, such as SVN, only the central repository has the complete history, so the user needs a network connection to access the history.
Branch handling: This is one of the major differences. Git has built-in support for branches and strongly encourages developers to use them, whereas SVN also supports branches, but they are not as deeply built into its practice and workflow. In Git, we can have multiple branches of a repository, carry out development and testing on each, and then merge them, so history forms a tree. In SVN, everything is linear; whenever you add, delete, or modify any file, the revision number just increments by one. Even if you roll back some changes in SVN, it is recorded as a new revision:
Smaller space requirements: Git repositories and working directory sizes are very small in comparison with SVN.
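The linear-versus-tree difference can be sketched with a toy model (plain Python, purely illustrative; the class and method names are invented for this sketch and are not how SVN or Git is actually implemented): SVN-style history is a single incrementing revision counter, while Git-style history is a graph of commits, each pointing at its parents, with branches as movable labels.

```python
class LinearHistory:
    """SVN-style: every change, even a rollback, is just the next revision number."""
    def __init__(self):
        self.revision = 0

    def commit(self):
        self.revision += 1
        return self.revision


class Commit:
    """Git-style commit: a node pointing at its parent commit(s)."""
    def __init__(self, message, parents=()):
        self.message = message
        self.parents = list(parents)


class GraphHistory:
    """Git-style: commits form a graph; branches are movable named pointers."""
    def __init__(self):
        self.branches = {"master": Commit("initial")}

    def commit(self, branch, message):
        parent = self.branches[branch]
        self.branches[branch] = Commit(message, [parent])

    def branch(self, name, source):
        # A new branch is just another label on an existing commit.
        self.branches[name] = self.branches[source]

    def merge(self, into, frm, message):
        # A merge commit has two parents, one from each line of development.
        self.branches[into] = Commit(
            message, [self.branches[into], self.branches[frm]])


svn = LinearHistory()
svn.commit()
svn.commit()
print(svn.revision)  # 2 -- rollbacks would still bump this number

git = GraphHistory()
git.branch("feature", "master")
git.commit("feature", "add login page")
git.merge("master", "feature", "merge feature")
print(len(git.branches["master"].parents))  # 2 -- a merge commit has two parents
```

In the graph model, branching and merging are cheap pointer operations, which is why Git encourages them; in the linear model, every action, including undoing one, only ever moves history forward by one revision.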
The following are some of the features of Git:
Captures snapshots, not entire files: This is a major difference between Git and other VCSs. Most VCSs keep records of revisions in the form of files; that is, they keep a set of file changes for every revision. Git accounts for changes differently: every time you commit or save the state of your project, Git basically takes a snapshot of what your files look like at that very moment and stores a reference to that snapshot. If files have not changed, Git does not store the file again; it stores a link to the previous identical file it has already stored.
Data integrity: Before storing any data in a Git repository, Git first checksums it, and then refers to it by that checksum. This means that if you carry out any modification to a file, Git will have a record of every modification. The mechanism used by Git for checksumming is known as the SHA-1 hash.
SHA-1 hash looks something like this:
b52af1db10a8c915cfbb9c1a6c9679dc47052e34
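As a sketch of this checksumming, the hash Git assigns to a file's contents (a blob) can be reproduced with nothing but Python's standard hashlib: Git hashes the header blob <size>\0 followed by the content. This is a minimal illustration, not Git's implementation; the hash below is what git hash-object reports for the content "hello\n".

```python
import hashlib

def git_blob_sha1(content: bytes) -> str:
    """Reproduce Git's blob hash: SHA-1 over b'blob <len>\\x00' + content."""
    store = b"blob %d\x00" % len(content) + content
    return hashlib.sha1(store).hexdigest()

# Identical content always yields an identical hash, so Git stores it only once.
print(git_blob_sha1(b"hello\n"))
# Change a single byte and the hash, and hence the stored object, changes.
print(git_blob_sha1(b"hello!\n") == git_blob_sha1(b"hello\n"))  # False
```

This content-addressing is what gives Git both its integrity guarantee (any corruption changes the checksum) and its deduplication (unchanged files are stored as links to the identical object already in the database).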
States and areas: Git views all files in three main states:
Modified: The file has been changed, but the change has not yet been committed to the database.
Staged: The modified file has been marked in its current version to go into the next commit.
Committed: The data is safely stored in your local database.
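These three states can be captured in a toy model (plain Python, purely illustrative, not Git's actual implementation): the same file name can hold different content in the working tree, the staging area (index), and the last commit, and a file's state falls out of comparing the three.

```python
class TinyRepo:
    """Toy model of Git's three states for a single file's content."""
    def __init__(self):
        self.working = {}  # filename -> content as edited on disk
        self.index = {}    # staged snapshot (what 'git add' recorded)
        self.head = {}     # last committed snapshot

    def edit(self, name, content):
        self.working[name] = content           # file is now "modified"

    def add(self, name):
        self.index[name] = self.working[name]  # file is now "staged"

    def commit(self):
        self.head = dict(self.index)           # staged content becomes "committed"

    def status(self, name):
        if self.head.get(name) == self.working.get(name):
            return "committed"
        if self.index.get(name) == self.working.get(name):
            return "staged"
        return "modified"


repo = TinyRepo()
repo.edit("app.py", "print('v1')")
print(repo.status("app.py"))  # modified
repo.add("app.py")
print(repo.status("app.py"))  # staged
repo.commit()
print(repo.status("app.py"))  # committed
```

The same cycle repeats for every change: editing moves a file back to modified, git add moves it to staged, and git commit records the staged snapshot.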
Here are the steps and commands that will guide you through installing and setting up Git and creating a repository on a very famous Git hosting service, GitHub.
To use Git, we first have to install the Git package on our system:
For RPM-based distributions (Fedora/RHEL/CentOS):
# yum install git
For Debian-based distributions (Debian/Ubuntu):
# apt-get install git
Configure your identity with Git, because every Git commit uses this information. For example, the following configuration sets the username awsstar and the email [email protected]:
# git config --global user.name "awsstar"
# git config --global user.email "[email protected]"
Check your settings; you will find the preceding username and email ID:
# git config --list
Now, let's try to create a repository on GitHub:
Open www.github.com in your web browser and log in with your credentials.
Click on New Repository.
Then, we will get something like the following screenshot. We have to mention the Repository name and a Description of the repository. After that, we need to select Public or Private based on our requirements. When we opt for Public, anyone can see our repository, but we pick who can commit; when we opt for Private, we pick who can see it and who can commit, meaning that by default it won't be visible to anyone. After that, we can initialize the repository with a README, where we can give a detailed description of the project, and click on Create Repository:
Once we have a repository,
HelloWorld