Thursday, December 10, 2015

Shell scripting like a boss

It is a good idea to put all your shell scripts in one directory and add that directory to your PATH. To do this, open up your .bashrc or .profile file

nano ~/.bashrc

Add this line to it 

export PATH=$PATH:~/scripts

This means there should be a folder named scripts in your home folder, e.g. /home/username/scripts.

Then reload your bashrc file 

source ~/.bashrc

Now you are ready for scripting. Add a new file named hello to the scripts folder.

Add this

#!/bin/bash 
echo My first program

Make the script executable with chmod +x ~/scripts/hello. Then in any directory type hello and your script executes.

Adding parameters to a script:

#!/bin/bash

# Prints an error message when no (or the wrong) parameter is given
function echoshit() {
    echo "No parameter found or wrong parameter"
}

# $1 is the first parameter given on the command line
if [[ "$1" == "--trolli" ]]; then
  echo "trolli echoed"
else
  echoshit
fi


Read more: http://www.bashguru.com/2009/11/how-to-pass-arguments-to-shell-script.html
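For completeness, here is a minimal sketch of the positional parameter variables Bash provides (the script name params.sh is just a hypothetical example):

#!/bin/bash
# params.sh: hypothetical example showing Bash's built-in parameter variables
echo "Script name: $0"
echo "First parameter: $1"
echo "Second parameter: $2"
echo "Number of parameters: $#"
echo "All parameters: $@"

Save it to the scripts folder, make it executable and run for example params.sh foo bar to see the values.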

Wednesday, October 21, 2015

Hacking with Kali Linux

I was curious about hacking and the so-called "dark side of the internet", so I installed Kali Linux in VirtualBox and checked out a few things.

Finding website admin panel with dictionary scan


The first thing I wanted to check is how you can find admin panels. Every content management system (CMS) has an admin panel for logging in and maintaining the site. I found a Perl script that scans the target site for different admin panel names. Unfortunately this is a so-called dictionary attack, so if the panel name doesn't happen to be in the list, it won't find anything.

Open up your Kali linux and download this file: http://www.2shared.com/complete/R1eEFhs3/def_adminfinder.html

CD to your download folder and just run:

perl def_adminfinder.pl 

and it will launch. The script will then ask for the target site; type in your site and press enter.


Then we can see the script checking possible admin panel locations.


You have to wait for the script to finish to see the results, unless you spot "status: found" in the output stream. This is a very simple way to find an admin panel, but it is only a weak dictionary tool. It is not really hacking or cracking in any sense; it is just a tool.

How to generate a password list


When a hacker is bruteforcing into a system, it basically means trying every possible combination of the numbers, letters and special characters that are defined. For that we need a list of those combinations. On Linux you can generate these files with a tool called crunch.

type in: crunch 4 4 123456 > passwords.lst

The first number is the minimum password length and the second number is the maximum length. After that I have defined the character set 123456 and redirected the output to a file called passwords.lst. This command creates every possible four-character combination of the characters 123456, which is 6^4 = 1296 lines.





As you can see, we now have the different combinations in a list. Next, try crunch 2 2 abcd > passwords2.lst to explore how this works.


Then we have all the combinations of abcd with minimum and maximum length 2.

Crunch is a tool for creating password lists for bruteforce attacks, which can take a very long time. You can also find lists of the most used passwords by googling a little bit.
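If I remember correctly, crunch also has an -o flag for writing straight to a file instead of redirecting the output, so something like this should work too:

crunch 4 4 123456 -o passwords.lst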

Password attack with Hydra against basic authentication


Now if you have a website with basic authentication, you could create a massive list of words and then "bruteforce" yourself into that site. Another way is to download a ready-made password list to speed things up a little bit; you can find some here: https://github.com/danielmiessler/SecLists/tree/master/Passwords

The basic authentication box will look like this:



Give command:

hydra -L accounts.txt -P passwords.txt http://www.yoursite.com

-L gives the account list as a parameter and the -P option gives the password list. Hydra will then try every combination of the words in the lists.


And the account + password combination in my list matched. I have successfully logged in.


SQLmap injection tool

If a PHP page URL looks like this: page.php?id=1, you can test whether the page is vulnerable to SQL injection by adding ' to the end of the URL, like www.yourpage.com/page.php?id=1'

If the answer you get is something like "You have an error in your SQL syntax", the page is vulnerable to attacks.

A simple command to find out the databases is: sqlmap -u www.yoursite.com --dbs
This will try different SQL injections and list all the databases available.


And as you can see, I found a vulnerable website. Now I know their technology and what databases they have. This is just a short demonstration of what you can do with this tool.

To continue checking what is inside a database, use the command:

sqlmap -u www.yoursite.com/ -D information_schema --tables

This will list the tables inside the information_schema database.
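From there you could dig deeper with the -D, -T and --dump options; a sketch where the database and table names are just placeholders:

sqlmap -u www.yoursite.com/page.php?id=1 -D targetdb -T users --columns
sqlmap -u www.yoursite.com/page.php?id=1 -D targetdb -T users --dump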

Scanning email addresses with harvester

If you want to scan for email addresses using search engines, there is a tool called "theharvester". Fire up your Kali Linux and type theharvester to get information about the app.

You can scan for emails with the command theharvester -d www.nikiahlskog.com -l 50 -b all

From the help output we can see that -d is the domain we are searching, -l limits the number of results and -b is the data source (the search engine, or "all" of them), if I understood this correctly. After scanning my own site I found 3 email addresses, but none of them is real.





Using Hydra to hack a login form


To attack a login form you need a password list and a usernames list. Use the command:

hydra -L usernames.txt -P passwords.txt testsite.com http-get-form "/index.php:admin_username=^USER^&passwordfield=^PASS^:Denied"

-L is the username list and -P is the password list. http-get-form specifies the form module and the page, then comes the username field we try to attack, then the password field, and after the last : we put a word that hydra will look for in the response to detect that the login was denied.

Cracking WLAN passwords with reaver

Will continue this later probably...

airmon-ng start/check/stop
airodump-ng wlan0mon
wash -i wlan0mon -C
reaver -i wlan0mon -b BSSID --fail-wait=360

airodump-ng --bssid BSSID -c 6 --write /root/Desktop/crack-wpa wlan0mon

Free VPN with vpnbook and openvpn, how to use with Linux

VPN, the tool for everyone who wants to be anonymous on the internet! I am happy that there is a place called http://www.vpnbook.com/ which is 100% free.

Start by surfing to http://www.vpnbook.com/freevpn and download for example the Euro1 OpenVPN Certificate Bundle to your Linux computer. It will be a zip file, so extract it somewhere on your system.

Next install the OpenVPN client: sudo apt-get update && sudo apt-get install openvpn

This will update your repositories and install the OpenVPN client. Then CD to the folder where you extracted the certificate bundle. To start the VPN, use a command like:

sudo openvpn --config vpnbook-euro1-tcp443.ovpn

It will ask for a username and password, which you can find here: http://www.vpnbook.com/freevpn

Then just type in the credentials and wait a few seconds. After that, try a geolocation finder and you will see that your IP and country have changed! Notice that the command needs to be run with sudo in order to work.
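A quick way to double-check from another terminal is to ask an external service for your public IP, assuming something like ifconfig.me is reachable:

curl ifconfig.me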


You need to leave the terminal open for as long as you want to run the VPN. To close the connection, hit
CTRL + C

I am originally from Finland, but after the VPN tunnel my location is:


Also notice that the certificates and passwords may change occasionally, so always keep them updated.

Backup your home server to usb hdd

I needed to take a backup of my server and wanted to copy the whole disk. I found this useful tool called dd, which comes out of the box on a Debian Linux server.

Check which disk you want to put into the image file: sudo fdisk -l


Here we can see that I have /dev/sda as my main disk. To back it up onto a USB HDD, find where your USB device is mounted. Usually it is under /media.

Then just give a simple dd command:

sudo dd if=/dev/sda of=/media/usb0/backup.img

This will create an .img file as big as your hard drive, so for example my /dev/sda is 120 GB, so it will create a 120 GB .img file. Make sure your USB HDD has enough free disk space.

The second thing to note is that the program runs for as long as it takes; for example, a 160 GB HDD backup took 80 minutes for me. You need to leave the terminal open for that time. You can monitor the progress by checking the file size with ls -l /media/usb0/; by repeating that command you can see how big the file has grown. There is no verbose flag as far as I know.
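Actually, if I am not mistaken, newer versions of GNU dd do have a progress option, and you can also poke a running dd with the USR1 signal to make it print statistics:

# newer coreutils: show progress while copying
sudo dd if=/dev/sda of=/media/usb0/backup.img status=progress

# or, for an already running dd, from another terminal:
sudo kill -USR1 $(pgrep ^dd$)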

Tuesday, June 9, 2015

TestLink for test case management

To get TestLink to work, create a virtual host file which allows overrides:

<VirtualHost *:8010>

        ServerAdmin Niki
        ServerName testlink
        ServerAlias testlink
        DocumentRoot /home/shnigi/public_html/testlink/

        <Directory /home/shnigi/public_html/testlink/>
                Options Indexes FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
        </Directory>

</VirtualHost>
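After saving the file, you still need to tell Apache about it; assuming the vhost is saved under /etc/apache2/sites-available/ (as testlink, or testlink.conf on Apache 2.4), something like this should do it:

sudo nano /etc/apache2/ports.conf    # make sure it contains: Listen 8010
sudo a2ensite testlink
sudo service apache2 reload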

Monday, June 8, 2015

Frontend stuff Node.js, NPM, Bower & Gulp

NodeJS


On the latest version of Ubuntu, you can simply:

sudo apt-get install nodejs nodejs-dev npm, which will install Node.js and npm on your computer.

Create a new folder to play with and CD into that folder.

Create file named: "hello_node.js"

insert:

// hello_node.js
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello Node.js\n');
}).listen(8124, "127.0.0.1");
console.log('Server running at http://127.0.0.1:8124/');

Start the server by typing node hello_node.js in your terminal and leave it open.

Now, if you navigate to http://127.0.0.1:8124/ in your browser, you should see the Hello Node.js message.
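You can also test it from another terminal without a browser:

curl http://127.0.0.1:8124/
# prints: Hello Node.js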


Bower init & NPM init


Install bower and gulp globally with command:

npm install -g bower
npm install -g gulp

Now we have bower and gulp globally installed. Make a new folder for your project and go into that folder. Then do

npm init

This will launch a generator for package.json. When you have set everything correctly, accept it. Now you can use the

npm install command, which will install all the required modules. To get more modules, run

npm install bower --save-dev, which will install bower into your project; --save-dev will add bower to the package.json file as a dev dependency. You can also use plain --save, which will then add it as a regular dependency.

After you have installed bower, run the command bower init, which will create a bower.json file. Now you can install bower components by running bower install bootstrap --save, which will install Bootstrap under the bower_components folder.


The --save flag means that the module is a dependency of your project: without it, the project can't work (or may work, but not correctly). The --save-dev flag means that the module is needed by the developer but not necessary in the production version.



Gulp


Gulp is meant to automate and enhance your workflow. You basically define tasks for gulp and it will do all those things for you automatically. First add some dev dependencies to our project:

$ npm install gulp-ruby-sass gulp-autoprefixer gulp-minify-css gulp-jshint gulp-concat gulp-uglify gulp-imagemin gulp-notify gulp-rename gulp-livereload gulp-cache del --save-dev

Then create empty file named gulpfile.js

And now for the lazy part. Head over here to read about the gulpfile. Very well explained: http://www.sitepoint.com/introduction-gulp-js/

Thursday, May 7, 2015

SSH Git-repository

On ssh-server install git:

$ sudo apt-get install git
Create an empty folder for git:
$mkdir gitkansio && cd gitkansio
Initialize a bare git repository:
$git --bare init

On local machine:

Create a folder and initialize Git:
$mkdir gittest && cd gittest
$sudo apt-get install git
$git init
Create some files
$ echo "lol" >> test.txt
Commit to Git
$git add .
$git commit -m "testing git"
Add remote repository
$git remote add origin ssh://username@server.com:1234/home/user/gitkansio
Push to remote
$git push origin master
$ git branch --set-upstream master origin/master

Clone the repository:
$ git clone ssh://gitmies@www.nikiahlskog.com:4242/home/gitmies/gitkansio

To remove the remote: $ git remote rm origin

Check remotes with command: git remote -v 
Set new remote: git remote set-url origin ssh://gitman@365rent.fi:2222/home/gitman/365rentrepo

Wednesday, May 6, 2015

Development mode on with LAMP

I was wondering how to turn development mode on with LAMP.

What you have to do is set this in /etc/apache2/ports.conf: Listen 127.0.0.1:80
That means only localhost can access Apache and no one else can see it.

Second, turn PHP error display on by editing the file /etc/php5/apache2/php.ini and changing display_errors = Off to display_errors = On.

Create Apache benchmark logs

With ab (Apache Bench), -n 100 sends 100 requests in total and -c 20 keeps 20 requests running concurrently; the output is redirected into a dated log file.

ab -n 100 -c 20 http://localhost/ > benchmarklog-$(date +\%d\%m\%y).txt

ab -n 100 -c 20 http://localhost/ > /home/shnigi/public_html/ablogs/log-$(date +\%d\%m\%y).txt
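If you want a fresh benchmark log every night, the same command can also be dropped into crontab (covered further down in this blog); a sketch reusing the path above and running at 03:00:

0 3 * * * ab -n 100 -c 20 http://localhost/ > /home/shnigi/public_html/ablogs/log-$(date +\%d\%m\%y).txt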

Tuesday, April 28, 2015

Backup Wordpress Database

This should get a full backup of your database including users.
mysqldump --routines --flush-privileges --databases DBNAME > /home/USERNAME/dump-$(date +\%d\%m\%y).sql -u root -pPASSWORD


Then just import the database on your new server and remember to copy the wp-config.php file as well.
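On the new server, importing the dump back should be a one-liner, since the dump was taken with --databases and therefore contains the CREATE DATABASE statement (the file name is whatever the dump command produced):

mysql -u root -p < /home/USERNAME/dump-DDMMYY.sql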

Update Wordpress and file permissions under Linux

To update WordPress, install plugins and upload media without FTP, do the following:

Insert this line into the WordPress wp-config.php file: define('FS_METHOD','direct');

Then, under the wp-content folder, set the plugins and uploads folders to be owned by the user www-data:

chown www-data plugins
chown www-data uploads



In the permissions listing, the owner user is on the left and the group on the right.

To update, you need to give www-data access for a short time. So do sudo chown www-data wordpress/ -R, then update, and after that give ownership back to your own user.


Setting only the upgrade folder to www-data should allow upgrading, but unfortunately I couldn't test this. Using chmod 777 -R for the whole WordPress folder allows updating, but that is not a good idea.

WordPress defaults are 755 for folders and 644 for files. To set things back, use find with chmod:

To recursively give directories read&execute privileges:
find /path/to/base/dir -type d -exec chmod 755 {} +
To recursively give files read privileges:
find /path/to/base/dir -type f -exec chmod 644 {} +

Or in one go (note that this can fail with an "argument list too long" error if there are very many objects):
chmod 755 $(find /path/to/base/dir -type d)
chmod 644 $(find /path/to/base/dir -type f)

Or, to reduce chmod spawning:
find /path/to/base/dir -type d -print0 | xargs -0 chmod 755 
find /path/to/base/dir -type f -print0 | xargs -0 chmod 644

Wednesday, April 15, 2015

Scan all IP addresses in the same local network

Scanning through all IP addresses in the same local network is an easy task with Linux.

Install arp-scan:
$ sudo apt-get install arp-scan

Scan all IPs:
$ sudo arp-scan --interface=eth0 --localnet

Where "eth0" is your network device; if you don't know what it is, run ifconfig to check it out:
$ ifconfig


Monday, April 13, 2015

Linux aliases

You can create aliases on your Linux system by editing the file:

~/.bash_aliases

Reload the aliases by running:

source ~/.bash_aliases

For example if you install composer:

curl -sS https://getcomposer.org/installer | php

https://getcomposer.org/download/

You can then create an alias for composer by editing the aliases file and adding:

alias composer='/usr/local/bin/composer'

Now you can run composer anywhere just by typing composer
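A couple of other example aliases, just to show the format:

alias ll='ls -la'
alias update='sudo apt-get update && sudo apt-get upgrade'

Reload them with source ~/.bash_aliases and the new shortcuts are available in your shell.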

Sources:
http://ss64.com/bash/syntax-bashrc.html

Crontab timed mysql dump and commands

Crontab is a scheduling program included in Linux systems. With crontab you can execute commands automatically at specific times and dates.

To open crontab, give the command $ crontab -e (to list existing jobs, use crontab -l).

You can run this command as root or as a regular user. If you need the command to do root things, edit root's crontab.

* * * * * command
┬ ┬ ┬ ┬ ┬
│ │ │ │ │
│ │ │ │ └──── day of the week (0–7) (sunday = 0 or 7)
│ │ │ └────── month (1–12)
│ │ └──────── day (1–31)
│ └────────── hours (0–23)
└──────────── minutes (0–59)


Here are a few examples of how to use crontab. For example, I want to make a tarball of my homepage. This command creates the tarball under /home/shnigi with the name mysite.tar.gz. The tarball is created from /home/shnigi/public_html and the command runs every day at 15:12.

12 15 * * * tar -czvf /home/shnigi/mysite.tar.gz /home/shnigi/public_html

To backup your MySQL database use command:

30 15 * * * /usr/bin/mysqldump --all-databases > /home/shnigi/dump.sql -u root -pPASSWORD

This would run every day at 15:30 and back up all databases to /home/shnigi/dump.sql. Note that the password is inserted right after -p without a space. The command below is the same, but it adds the date to every MySQL dump file name.

12 16 * * * /usr/bin/mysqldump --all-databases > /home/dump-$(date +\%d\%m\%y).sql -u root -p

You can test the commands themselves without crontab first. Here are a couple more example entries:

#SQL dump
32 16 * * * /usr/bin/mysqldump --routines --flush-privileges --all-databases > /home/dump-$(date +\%d\%m\%y).sql -u root -pPASSWORD

#backup all home folders
26 16 * * * tar -czvf /home/homefolders.tar.gz /home

Sources:
https://www.linode.com/docs/databases/mysql/back-up-your-mysql-databases
http://www.thegeekstuff.com/2011/07/cron-every-5-minutes/

Remove .php extension

To remove the .php extension from your site URLs, for example turning testsite.com/blog.php into just testsite.com/blog, you need to do the following on your system:

$ sudo a2enmod rewrite, which enables Apache's rewrite module (restart Apache afterwards). Then create a hidden .htaccess file in your website root folder and type:

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^([^\.]+)$ $1.php [NC,L]
Edit your navigation page and remove all the .php extensions from the links.


You also need to make sure your virtual host allows .htaccess overrides (AllowOverride All).
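A quick way to verify the rewrite works, assuming the testsite.com example above:

sudo service apache2 restart
curl -I http://testsite.com/blog    # should now return 200 OK instead of 404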



Wednesday, March 25, 2015

Robot Framework : Installation on local Linux and VPS. XML + Selenium2Library. Jenkins script.

http://robotframework.org/
Robot Framework is a generic test automation framework for acceptance testing and ATDD.

In this tutorial I am going to use Selenium2Library. Robot Framework is based on libraries; if there is no library for your needs, you can just create your own.

Test cases are written in .txt or .robot files. Test cases are keyword-based and can be parameterized. The framework itself is written in Python. Think of a test case as rows of a table: keywords and arguments are separated by a tab or at least two spaces.


Keyword         Argument                                 Data
Click Button    for example a locator (id, name etc.)
Input Text      name                                     Jack

Selenium library documentation: http://rtomac.github.io/robotframework-selenium2library/doc/Selenium2Library.html#Click%20Button

To install Robot Framework on Xubuntu:

$ sudo apt-get install python-pip
$ sudo pip install robotframework
$ sudo pip install robotframework-selenium2library
$ python
>>> import Selenium2Library
>>> exit()

And now you should have a working Robot Framework installation including Selenium2Library.

Create a new test case:

$ mkdir tests
$ cd tests
$ nano test.robot

*** Settings ***
Library           Selenium2Library

*** Variables ***
${PAGE}           http://www.amazon.com

*** Test Cases ***
User must be able to open amazon.com
    [Documentation]    Type in Laptop and search
    [Tags]    Smoke
    Open Browser    ${PAGE}    ff
    Input Text    twotabsearchtextbox    Laptop
    Click Button    Go
    Close Browser

*** Keywords ***

Under Settings we have imported the Selenium library. In Variables I have defined one variable that can be used anywhere in the test case. Under Test Cases, the first row is the test name, [Documentation] describes what the test does, and with [Tags] tests can be grouped together so they are easier to find in the report files. "Open Browser" is a Selenium library keyword, ${PAGE} is my variable and ff means Firefox.

And now to run your test:
$pybot test.robot


My computer opened a browser and went to Amazon, OK! Everything is working. Robot Framework created some log and report files in the same folder where our test case lies.


Green is good!

pybot -d /some/path will put the log files somewhere else, for example: $ pybot -d results tutorial.robot

To run multiple test sets at once, do $ pybot foldername

Example: pybot -d results tests (this will run all test sets under the tests folder and put the results into the results folder, which keeps things separate and easier to maintain).
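Since the example test above was tagged Smoke, you can also run only the tests that carry a certain tag; a small sketch, assuming you are in the folder that contains the test files:

pybot -d results --include Smoke .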




Check my example script:
 https://github.com/shnigi/robotframework

Install on VPS / Vagrant


To run Robot Framework on a Vagrant box or VPS you need to run the browser headless, because there is no real screen; you have to fake one. To do this, first install Firefox:

$sudo apt-get install firefox

Install software to fake a screen: $ sudo apt-get install xvfb

Add a "display" and leave it running in the background: $ sudo Xvfb :10 -ac &

Run Firefox on the fake display: $ export DISPLAY=:10 && firefox

And now you should be able to run Robot Framework tests on your Vagrant box / VPS.
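To glue the steps together, here is a small sketch of a helper script (the name run_headless.sh and the tests folder are just examples):

#!/bin/bash
# run_headless.sh: hypothetical helper for running Robot Framework tests headless
Xvfb :10 -ac &          # start the fake display in the background
export DISPLAY=:10      # point graphical programs to it
pybot -d results tests  # run all test sets under the tests folder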

A closer look at the Robot files


Robot Framework should be installed on the CI server as well as on the developer's computer. This way the developer can run test sets locally, and when a CI build is triggered, it will also run the tests.

Create a file resource.robot in the tests folder that contains all the test sets. This file acts as a general settings file, containing all the basic variables, URLs and tasks like opening the browser, maximizing it, etc.

Then you can use those keywords in a test file; see my example repository linked above.

Robot Framework doesn't create very sleek reports, so it is useful to use some test case management software for project reporting and test case planning, like Quality Center or TestLink. Of course, a good old Excel sheet will also do. Some people use only the Robot Framework test files, because they are pretty easy to understand if you know Robot and programming basics.

Reading XML files


Robot Framework has a built-in library for reading XML. The principles are the same as with Selenium2Library, but you don't have to install it or check it in the Python console beforehand; it is already there.

XML documentation can be found here where all the keywords are explained: http://robotframework.org/robotframework/latest/libraries/XML.html

XML .robot file looks like this:

*** Settings ***
Library           XML

*** Variables ***
${XmlFile}        lol.xml

*** Test Cases ***
Parse-Xml-Test
    ${root}=    Parse XML    ${XmlFile}
    Should Be Equal    ${root.tag}    example
    ${first}    Get Element    ${root}    first
    Should Be Equal  ${first.text}  text

And the XML file is the same as in the XML library documentation:

<example>
  <first id="1">text</first>
  <second id="2">
    <child/>
  </second>
  <third>
    <child>more text</child>
    <second id="child"/>
    <child><grandchild/></child>
  </third>
  <html>
    <p>
      Text with <b>bold</b> and <i>italics</i>.
    </p>
  </html>
</example>

It parses the XML file, checks that the root tag is "example" and that the first element's text is "text". Simple, isn't it?


Jenkins


I created a small shell script for the Jenkins build. My system has a user called "jenkka". The only thing Jenkins does is go to jenkka's home folder, where the test.robot file lies, run all the tests and put the results to /var/www. Simple as that.

#!/bin/sh
cd /home/jenkka
pybot -d /var/www test.robot

Best Practices


A Robot Framework best practice is to create a shell script that triggers the wanted tests. For example, we could create a shell script named "run_qa_tests.sh" with this code:

#!/bin/bash
source executerobot.sh
runQAAcceptanceTests

Then we would have a script named "executerobot.sh" which has a function named runQAAcceptanceTests:

#!/bin/bash
# executerobot.sh: a  shell script to run robot tests

function runQAAcceptanceTests() {
  pybot some code here -d results .
}

Trigger the first script with the command ./run_qa_tests.sh, which will then run our function. The dot . at the end of the pybot command runs all tests recursively under our main robot test folder. By doing this, we can run the tests both in our local environment and on the test server: we can define parameters for both cases and set up Jenkins properly.

To run a single test, use the shell command pybot -d results --test "testname" . which will recursively find the test case named testname. The last dot does the recursive search.

To add a variable file, which may contain for example test data, numbers etc., you can give a parameter like --variablefile ./resources/testdata.py. The Python file contains data which can then be used as a variable, ${variable}.

When doing ATDD (Acceptance Test Driven Development) or BDD (Behaviour Driven Development), the following grammar is widely used to formulate the pre- and postconditions of a test:
  1. Given: The static preconditions
  2. When: The behaviour under test (or that, which should be specified)
  3. Then: The expected results of the behaviour under the given preconditions

Tutorial end


Note to self: there is a syntax highlighter for the Atom editor: https://atom.io/packages/language-robot-framework

RIDE is a test data editor for Robot Framework. Install it with: pip install robotframework-ride

Read more about creating test case: http://en.wikipedia.org/wiki/Robot_Framework

Sources:
http://datakurre.pandala.org/2014/03/cross-browser-selenium-testing-with.html
http://testingknols.blogspot.fi/2014/05/robot-framework-installation-on-ubuntu.html
http://seleniummaster.com/sitecontent/index.php/selenium-robot-framework-menu/selenium-robot-framework-python-menu/221-read-xml-file-in-robot-framework-python


Wednesday, February 25, 2015

How To Install MediaWiki in a user directory

MediaWiki is a free software open source wiki package written in PHP, originally for use on Wikipedia. It is now also used by several other projects of the non-profit Wikimedia Foundation and by many other wikis, including this website, the home of MediaWiki.
http://www.mediawiki.org/

Now I am going to install MediaWiki on my personal server. To install MediaWiki you will need a LAMP stack (Linux, Apache, MySQL, PHP) and possibly an SSH connection.

Step 1


First head over to
http://releases.wikimedia.org/mediawiki/

Find the latest version available and copy its address. Currently the newest version is 1.24. http://releases.wikimedia.org/mediawiki//1.24/mediawiki-1.24.1.tar.gz

Copy that address and then head over to your user's public_html folder. Use wget to download the file and tar to extract it:

$ wget http://releases.wikimedia.org/mediawiki//1.24/mediawiki-1.24.1.tar.gz
$ tar xvfz mediawiki-1.24.1.tar.gz

You may need to restart Apache:

$ sudo service apache2 restart

Step 2

Create a MySQL database and user.

$ mysql -u root -p
mysql> create database my_wiki;
mysql> grant index, create, select, insert, update, delete, alter, lock tables on my_wiki.* to 'wikiuser'@'localhost' identified by 'password';
mysql> flush privileges;
mysql> exit;

Step 3

Head over to your MediaWiki site: www.yoursite.com/~username/mediawiki-1.24.1
Click install and fill in everything that is asked.
When the install is finished, MediaWiki will automatically download a LocalSettings.php file. Upload this file to the root of your MediaWiki installation.

Step 4 Enjoy your MediaWiki site

Step 5 Images / Uploads

To allow image uploads on your MediaWiki site, you need to edit the LocalSettings.php file and chmod 777 the images folder.

$ nano LocalSettings.php
Set the line $wgEnableUploads = true;
$ chmod 777 images/

 

Wednesday, February 4, 2015

Trying Bitbucket

Introduction

Bitbucket is like GitHub. Which one should I use? Both GitHub and Bitbucket offer great Git services, but each has its own features and pricing plans.

Answer: use both. Use GitHub for open source and public repos, and Bitbucket for private repos.

Interface

Bitbucket dashboard


Github dashboard
To start using a Git service on a Linux operating system, install Git first.

$ sudo apt-get install git

Step 1: Check for SSH keys
First, we need to check for existing SSH keys on your computer. Open up your Terminal and type:
ls -al ~/.ssh
# Lists the files in your .ssh directory, if they exist
Check the directory listing to see if you already have a public SSH key. The default public key file names are:
  • id_dsa.pub
  • id_ecdsa.pub
  • id_ed25519.pub
  • id_rsa.pub
Step 2: Generate a new SSH key
To generate a new SSH key, copy and paste the text below, making sure to substitute in your email address. The default settings are preferred, so when you're prompted to "Enter a file in which to save the key", just press Enter to continue.
ssh-keygen -t rsa -C "your_email@example.com"
# Creates a new ssh key, using the provided email as a label
# Generating public/private rsa key pair.
# Enter file in which to save the key (/home/you/.ssh/id_rsa):

Next, you'll be asked to enter a passphrase.
# Enter passphrase (empty for no passphrase): [Type a passphrase]
# Enter same passphrase again: [Type passphrase again]
Which should give you something like this:
# Your identification has been saved in /home/you/.ssh/id_rsa.
# Your public key has been saved in /home/you/.ssh/id_rsa.pub.
# The key fingerprint is:
# 01:0f:f4:3b:ca:85:d6:17:a1:7d:f0:68:9d:f0:a2:db your_email@example.com
Install xclip
sudo apt-get install xclip
Copy your ssh key
xclip -sel clip < ~/.ssh/id_rsa.pub
Paste it into the SSH keys section of your Bitbucket settings.
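If I remember right, you can test that Bitbucket accepted the key with:

ssh -T git@bitbucket.org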


Then configure your Git repo.

Create folder 
$mkdir code 
$cd code
$git init
$git config --global user.email "yourmail"
$git config --global user.name "yourname"

Then add your repository
$git remote add origin git@bitbucket.org:yourname/repo.git

Add all files to git in your local repo. Then commit changes.
$git add .
$git commit -m "testing git"

Push files to Bitbucket
$git push -u origin master

If you at some point encounter the error "fatal: remote origin already exists", set the remote again with the command:

git remote set-url origin git://new.url.here