Yearly Archives: 2015


Backing up a Laravel site to Amazon S3 with laravel-backup

Installing Laravel-backup

For this activity I’m going to be using the Laravel-Backup tool from Spatie:

https://github.com/spatie/laravel-backup

Follow the instructions on the GitHub readme.

  1. composer require spatie/laravel-backup
  2. add the service provider to config/app.php: Spatie\Backup\BackupServiceProvider::class,
  3. publish the config file, which adds a new laravel-backup config file to the config folder:
php artisan vendor:publish --provider="Spatie\Backup\BackupServiceProvider"

To start with, we will test the backup using the local filesystem, which is enabled by default.

You now have a couple of additional commands in Artisan.

(screenshot: the backup commands listed by php artisan)

Try php artisan backup:run

If it returns an error, it’s probably because your system cannot find the mysqldump command. If it is not found, you will have to track it down on your machine and then adjust the mysql dump_command_path in the laravel-backup.php file. Make sure the path ends in a forward slash, as the mysqldump command will be appended to it.

'mysql' => [
    /*
     * The path to the mysqldump binary. You can leave this empty
     * if the binary is installed in the default location.
     */
    'dump_command_path' => '/Applications/MAMP/Library/bin/',
],

When using the default local storage, the backup will be in the storage/app/backups folder. At this point, you should test the backup files.

Configuring S3 to receive the files

This assumes that you already have an AWS account with access to S3.

First of all create the bucket that will be used. I recommend a separate bucket for each site since that allows you to secure them individually.

Next, create an IAM identity for the bucket. Identity and Access Management (IAM) lets you scope credentials to a single bucket, preventing the credentials stored on this website from accessing other backups. The last thing you want is an intruder on one site accessing the backups for other sites, since those backups will contain access credentials for the other sites.

  1. Select create user
  2. Enter a name for the user (the name of the site perhaps)
  3. Copy the access credentials. These will be used to configure Laravel Flysystem in a moment.

Select the User and click Permissions then Inline Policies

Select Create One, then Custom Policy

Provide a policy name (no spaces)

Add the policy as below, adding the name of your new bucket

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "*",
            "Resource": [
                "arn:aws:s3:::your-bucket-name",
                "arn:aws:s3:::your-bucket-name/*"
            ],
            "Condition": {}
        }
    ]
}

Configure Laravel Flysystem

Edit config/filesystems.php

Under disks->s3, change the following so that the S3 keys are picked up from .env and do not end up in your repo.

 'driver' => 's3',
 'key' => env('S3_KEY'),
 'secret' => env('S3_SECRET'),
 'region' => env('S3_REGION'),
 'bucket' => env('S3_BUCKET'),

Then set the .env file with the actual values

S3_KEY='AKI****F2CH4****PFKQ' #your access key
S3_SECRET='kEjL********r3r+4QjkbU********NQIiiEfhb' #secret access key
S3_REGION='eu-west-1'
S3_BUCKET='your-bucket-name'

Next tell laravel-backup to use S3 (config/laravel-backup.php)

/*
 * The filesystem(s) on which the backups will be stored. Choose one or more
 * of the filesystems you configured in config/filesystems.php
 */
  'filesystem' => ['s3'],

and set the folder in which to store the backup

 /*
 * The path where the backups will be saved. This path
 * is relative to the root you configured on your chosen
 * filesystem(s).
 *
 * If you're using the local filesystem a .gitignore file will
 * be automatically placed in this directory so you don't
 * accidentally end up committing these backups.
 */
 'path' => 'backup',

Install S3 library

The S3 libraries are not shipped by default, so you will need to add them via Composer:

composer require league/flysystem-aws-s3-v3 ~1.0

Test so far…

You should run the backup again; all being well, your files will be pushed to S3, which you can inspect through the S3 console’s file browser.

Configuring CRON to run the job

Configure the server to call the schedule:run Artisan command every minute. This is covered in the Laravel docs.
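For reference, the crontab entry from the Laravel docs looks something like this (adjust the artisan path for your server):

* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1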

If your host does not support CRON, then a suggestion is made in an earlier blog post.

Set up the entries in your app/Console/Kernel.php file;

protected function schedule(Schedule $schedule)
{
    $schedule->command('backup:run', ['--only-files' => '', '--suffix' => '_files'])
        ->weekly()->mondays()->at('03:00')
        ->description('My-project Files backup')
        ->sendOutputTo('storage/logs/backup.log')
        ->emailOutputTo('mark@novate.co.uk')
        ->before(function () {
            \Log::info('Commencing Files Backup');
        })
        ->after(function () {
            \Log::info('My-project Files backup complete');
        });

    $schedule->command('backup:run', ['--only-db' => '', '--suffix' => '_db'])
        ->twiceDaily(2, 14)
        ->description('My-project Database backup')
        ->sendOutputTo('storage/logs/backup.log')
        ->emailOutputTo('mark@novate.co.uk')
        ->before(function () {
            \Log::info('Commencing Database backup');
        })
        ->after(function () {
            \Log::info('My-project Database backup complete');
        });
}

So here I have two backup jobs: one running once per week for all the files, and a twice-daily database backup. Following each, the log of the backup is sent via email.

Summary

This has been a long-winded setup as there are multiple steps. Laravel-backup is a very flexible backup solution and leverages league/flysystem to store backups in the cloud.

Using Amazon S3 and protecting it with IAM provides a robust destination for your backups.


Weekly notables August 9th 2015

Awesome list of Laravel related links at getawsomeness.com

Hacking with PHP is where I found inspiration for the use of flock: http://www.hackingwithphp.com/8/11/0/locking-files-with-flock

BBC Bloggers publish 13 tips for making responsive web design multi-lingual

I use Google Authenticator for Gmail, LastPass and Digital Ocean – keep an eye on this one: a One Time Password Authentication package, compatible with Google Authenticator.

One of my favourite YouTubers, Travis Neilson, shares his favourite tools: http://travisneilson.com/workflow-tools/


Using a file lock to stop cron jobs updating the same record

Recently I was faced with deliberately overlapping cron jobs that were both trying to work on the same database record.

After some research, using flock (file lock) seemed to be a good option.

Each minute, I trigger a job to update the database with information scraped from another site (public domain information – before you ask). I let each job run for just under three minutes, so that at any time there are three jobs running. If I want to push it further, I can just raise the maximum time for each job.

The project is in Laravel and uses Eloquent for the ORM and MySQL for the database. The problem was that the query to find the next record to service could take a relatively long time, so two processes could come to the same answer about which record to update next.

// open a lock file - can be used to pause other processes when they are also trying to query the db
$lockfile = fopen(storage_path('locks/operatorUpdate.lock'), "w");

// note the start time so the job can stop just shy of three minutes
$time_start = microtime(true);

while (microtime(true) - $time_start < 178) {

    // block until we hold the exclusive lock
    flock($lockfile, LOCK_EX);

    // establish the oldest record and stamp it as checked
    $operator = \App\Operator::orderBy('checked_at', 'asc')->orderBy('id', 'asc')->first();
    $operator->checked();

    // release the lock so the other jobs can run their query
    flock($lockfile, LOCK_UN);

    // do what I need here to process the record just grabbed

}
fclose($lockfile);

I decided to keep the lock file in a storage folder called locks. I used the storage_path() helper to ensure that the path is the same irrespective of how the function is called (cron jobs default to the invoking user’s home folder).

Bear in mind that I’m placing a blocking lock on the file only whilst I grab the oldest record and set its checked_at date. A second job arriving at the same time will hit the lock and wait the quarter of a second it takes for the original query to run (I have 280,000 rows in the table).
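The checked() call in the loop above is a helper on my Operator model that stamps the record. A minimal sketch of what it might contain (only the method name and its effect on checked_at come from the code above; the body is an assumption):

    // app/Operator.php - assumed implementation: stamp the record and save
    public function checked()
    {
        $this->checked_at = \Carbon\Carbon::now();
        $this->save();
    }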


Supporting delete cascade with SQLite and Laravel

If using SQLite, it is useful to be able to cascade a delete to related models. For instance, if a user is deleted, all their posts should also be deleted rather than being orphaned.

In the schema for the pivot table, you specify;

    $table->integer('post_id')->unsigned();
    $table->foreign('post_id')->references('id')->on('posts')->onDelete('cascade'); 

    $table->integer('user_id')->unsigned();
    $table->foreign('user_id')->references('id')->on('users')->onDelete('cascade'); 

This works out of the box for MySQL, but SQLite does not enforce foreign keys unless support is turned on.

I had this issue and created a workaround, but I’m not comfortable with the solution because it required me to change the Laravel source. I’m only a newbie, so I could not really see an ‘app’ way of doing it.

In config/database.php

    'sqlite' => [
        'driver'   => 'sqlite',
        'database' => storage_path().'/database2.sqlite',
        'prefix'   => '',
        'exec'     => 'PRAGMA foreign_keys = ON;',  // enable delete cascade
    ],

I added a new element, ‘exec’.

Then in /vendor/laravel/framework/src/Illuminate/Database/Connectors/SQLiteConnector.php, replace;

    return $this->createConnection("sqlite:{$path}", $config, $options);

with

    $pdo=$this->createConnection("sqlite:{$path}", $config, $options);

    //any exec statement?
    $exec = array_get($config, 'exec');
    if(isset($exec))
    {
        $pdo->exec($exec);
    }
    return $pdo;

This allows the foreign_keys property to be set each time the connection is opened, and also any additional exec statements that might be needed.
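An alternative that avoids patching vendor code would be to issue the PRAGMA from a service provider when the application boots. This is only a sketch, assuming the default connection is the sqlite one configured above, and note that it runs once per request rather than on every reconnect:

    // app/Providers/AppServiceProvider.php - a possible 'app' way (untested sketch)
    public function boot()
    {
        // Only force the PRAGMA when we are actually on the sqlite connection
        if (config('database.default') === 'sqlite') {
            \DB::connection()->getPdo()->exec('PRAGMA foreign_keys = ON;');
        }
    }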


Deploy Laravel 5 on shared hosting from Heart Internet

For trial sites and quick-to-deploy, low-traffic tools, it’s perfectly possible to host your site at Heart Internet using subdomains. Although these instructions are specific to Heart, they will work for other hosts, with or without subdomains.

Wait.....
Before you do anything, check that your host is providing PHP 5.4 or better (Laravel 5.0) or PHP 5.5.9 or better (Laravel 5.1 / 5.2).
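If in doubt, a quick way to check the host’s PHP version is to upload a one-line script, browse to it, and then delete it; a throwaway sketch (the filename is arbitrary):

    <?php
    // version.php - upload, visit in the browser, then remove it
    echo PHP_VERSION;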

1. Request subdomain setup

Heart run their subdomains on the same server: a folder is created in the public_html folder for the subdomain. For instance, I’m creating a service that will respond to dj3.mydomain.com, so on the mydomain.com server there will be a folder called public_html/dj3.

After requesting the subdomain, wait an hour for the DNS to all be in place.

2. Upload your site

Your Laravel code base should be located in a folder that is not accessible from the web.

Create a new folder in your root folder based on the name of your subdomain. This keeps each back end in its own space, in case you want to install another application later. Here I have used the name dj3core.

(screenshot: directory listing showing the dj3core folder)

FTP everything except your public folder into the back-end folder that you created (dj3core in my example).

FTP the contents of your public folder into the subdomain folder (dj3 in my example).

Make sure that you also copy the hidden .htaccess file into your subdomain folder. Do not put it in the root or the public_html folder.

3. Fix the paths in the index.php file

You need different paths in the index.php file from those you have probably been testing with, so before uploading, or in place on the hosted server, edit the index.php file (the one in the subdomain, e.g. /public_html/dj3/index.php) as follows;

/*
|--------------------------------------------------------------------------
| Register The Auto Loader
|--------------------------------------------------------------------------
|
| Composer provides a convenient, automatically generated class loader for
| our application. We just need to utilize it! We'll simply require it
| into the script here so that we don't have to worry about manual
| loading any of our classes later on. It feels nice to relax.
|
*/

// require __DIR__.'/../bootstrap/autoload.php';
require __DIR__.'/../../dj3core/bootstrap/autoload.php';

/*
|--------------------------------------------------------------------------
| Turn On The Lights
|--------------------------------------------------------------------------
|
| We need to illuminate PHP development, so let us turn on the lights.
| This bootstraps the framework and gets it ready for use, then it
| will load up this application so that we can run it and send
| the responses back to the browser and delight our users.
|
*/

// $app = require_once __DIR__.'/../bootstrap/app.php';
$app = require_once __DIR__.'/../../dj3core/bootstrap/app.php';

I have retained the original lines, and added the modified lines below.

Compared to the distribution, the application is found two directories up (../../) in the dj3core folder.

That’s it! Your site should now be working on the subdomain dj3.mydomain.com.

4. Problems?

If you are still having problems, check that the storage folder is writable.

At the time of writing, I have not tested email, but I don’t expect there to be a problem.

If you are using the HTML and URL helpers, make sure the url option is set correctly in the config/app.php file.


Help, my host does not support CRON jobs

If you are working on a shoestring and using a shared host for your latest Laravel 5 project, you may want to set up some scheduled jobs. Laravel 5 has a great scheduler built in, but it needs a kick every minute so it can determine whether it is time to run a job.

Through a third party service such as cron-job.org it is possible to provide this kick to the Laravel 5 Scheduler.

1. Create an account at cron-job.org

Accounts are free and permit you to schedule a task as frequently as once per minute.

2. Create a route in your application to kick the Laravel scheduler

    //trigger the scheduler
    Route::get('/hshhdyw7820037lammxh29' , function(){
        Artisan::call('schedule:run');
        return 'OK';
    });

Here I have used a random string for the path so that it is not accidentally ‘found’. If it would be a problem for your task to be triggered twice, you might want to protect the route further, for example by checking that the request comes from cron-job.org’s IP addresses.
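A rough sketch of such a guard (the address below is a placeholder from the documentation range, not cron-job.org’s real IP):

    // Guard the trigger route by source IP; 203.0.113.10 is a placeholder
    Route::get('/hshhdyw7820037lammxh29', function () {
        if (Request::ip() !== '203.0.113.10') {
            abort(403);
        }
        Artisan::call('schedule:run');
        return 'OK';
    });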

3. Add this route to the cron-job.org schedule

(screenshot: the cron-job.org schedule entry for the route above)

Other thoughts

Triggering the scheduler this way, rather than running the job directly, means that you can use the full power of the Artisan scheduler. Check out Eric Barnes’ intro to using the scheduler.

One thing not covered by Eric or the documentation is the ability to run a task every few minutes (every five minutes is catered for).

This example runs the feedData method of the ReplayServiceProvider every two minutes.

    $schedule->call('App\Providers\ReplayServiceProvider@feedData')->cron('*/2 * * * *');

Laravel 5 checkbox processing

One of the annoyances of HTML form processing is that checkboxes are not returned if they are unchecked.

This causes an issue if you want to use Laravel’s automatic form validation and then pass the validated form response straight to the model. Whilst it is possible to manage checkboxes in the controller, it always strikes me as messy. My solution is below. There will always be detractors who claim the validator is not the place for this; my argument is that I am validating that what comes out of the validator is either true or false, not true or missing.

Since the rules section of the form request object is actually a method, it is possible to interact with the content of the request.

So, in my EditUserRequest class, where I have a checkbox named ‘is_admin’;

	public function rules()
	{
		// default the value of the is_admin checkbox
		$this->merge(['is_admin' => $this->input('is_admin', 0)]);

		return [
			'name' => 'required|min:5',
			'email' => 'required|email',
		];
	}

I merge back into the request the value of the input, or a default of 0 (the second argument to Request::input). This sets the checkbox element to 0 if it is not present.

Then in the controller, I can simply use;

		$user->update($request->all());
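For the mass assignment above to work, is_admin must of course be fillable on the model; a one-line sketch, assuming a standard User model:

		// app/User.php - assumption: is_admin added to the fillable list
		protected $fillable = ['name', 'email', 'is_admin'];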

Laravel 5 csrf tokens in ajax calls

In Laravel 5, all requests pass through middleware which will not allow any POST request without the correct CSRF token.

CSRF (Cross-Site Request Forgery) protection prevents the site from receiving requests from clients it has not established a session with, i.e. a random POST request from a third party.

When using AJAX to post forms or changes in state, the CSRF token must be supplied along with the request.

For instance, if the view being rendered contains the JavaScript, simply use Blade tags to insert the token directly into the script:

                $.ajax({
                      type: "POST",
                      url: "/poke",
                      data: {   lat: lastlat,
                                lng: lastlng, 
                                bearing: 90,
                                '_token': '{!! csrf_token() !!}'
                            }
                    })

If the JavaScript is in a separate file (not processed by Blade), then the token can be set on a meta element and queried by jQuery at runtime.

<meta name="csrf-token" content="{!! csrf_token() !!}">

Putting the above in the master page layout ensures that the CSRF token is available on every page.

Then refer to the meta element in each JavaScript AJAX request;

                $.ajax({
                      type: "POST",
                      url: "/poke",
                      data: {   lat: lastlat,
                                lng: lastlng, 
                                bearing: 90,
                                '_token': $('meta[name="csrf-token"]').attr('content')
                            }
                    })
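Rather than adding the token to the data of every call, the CSRF middleware also accepts the token in an X-CSRF-TOKEN header, so a one-off jQuery setup covers all subsequent AJAX requests:

                // Attach the token from the meta element to every jQuery AJAX call
                $.ajaxSetup({
                    headers: { 'X-CSRF-TOKEN': $('meta[name="csrf-token"]').attr('content') }
                });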

Thanks to Kelt Dockin for the inspiration.