php-zalando, a PHP client to interact with the Zalando API

Again it’s been a while since I created a public PHP library to interact with another API, but here we go. For a project we needed to interact with Zalando, but unfortunately their API didn’t have a PHP wrapper/library. Until now!

I present to you php-zalando (packagist link), based on the Zalando API.

API endpoints covered so far

  • https://api.zalando.com/article-reviews
  • https://api.zalando.com/article-reviews/{reviewId}
  • https://api.zalando.com/article-reviews-summaries
  • https://api.zalando.com/article-reviews-summaries/{articleModelId}
  • https://api.zalando.com/articles
  • https://api.zalando.com/articles/{articleId}
  • https://api.zalando.com/articles/{articleId}/media
  • https://api.zalando.com/articles/{articleId}/reviews
  • https://api.zalando.com/articles/{articleId}/reviews-summary
  • https://api.zalando.com/articles/{articleId}/units
  • https://api.zalando.com/articles/{articleId}/units/{unitId}
  • https://api.zalando.com/brands
  • https://api.zalando.com/brands/{key}
  • https://api.zalando.com/categories
  • https://api.zalando.com/categories/{key}
  • https://api.zalando.com/domains
  • https://api.zalando.com/facets?{filters}
  • https://api.zalando.com/filters
  • https://api.zalando.com/filters/{name}
  • https://api.zalando.com/recommendations/{articleIds}

Nearly all API endpoints are covered, but there are still some items on the to-do list:
– achieve 100% code coverage (34% at the moment)
– add all optional filters/parameters, which differ per endpoint

Installation

Via Composer

Add php-zalando to your composer.json or create a new composer.json:

{
     "require": {
          "cschalenborgh/php-zalando": "dev-master"
     }
}

Now tell composer to download the library by running the command:

php composer.phar install

That’s it! Because of Composer’s autoloading you should now be able to use this library. Don’t forget to include the namespace.
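Once installed, using the library boils down to instantiating the client and calling the endpoint methods. The class and method names below are placeholders for illustration only (I’m assuming a client class with getter-style methods); check the php-zalando README for the actual namespace and available methods:

<?php

// Composer's autoloader takes care of loading the library
require __DIR__ . '/vendor/autoload.php';

// NOTE: class and method names are hypothetical examples, not taken from the docs
$zalando = new \Zalando\Client();

// e.g. fetch articles or brands
$articles = $zalando->getArticles();
var_dump($articles);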

Pushing Laravel logs to Loggly


Laravel uses the Monolog library for logging; however, by default it writes all logs to a local directory. Not very useful in a production environment.

That’s where Loggly comes into play. Loggly acts as a central (cloud) platform that can receive logs from a multitude of sources, such as PHP frameworks (Laravel being one of them), operating systems, their own Loggly API, and lots of other frameworks and services. In this short tutorial I’ll show you how to add Loggly to your multi-environment Laravel project.

1) Update composer

First we need to update Composer so we’re 100% sure that we’re using the latest Monolog version, because older versions don’t support Loggly.

composer self-update
composer update

2) Setup Loggly account

Go to https://www.loggly.com and create a (free) account.

3) Add Loggly credentials to your Laravel environment

Now the nice thing about Loggly is that it’s pretty easy to configure. Go to https://www.loggly.com/tokens and set up a token for your project. Do this per project to keep things separated. Got a big project? Then you might want to use source groups as well.
Now create a new file in your Laravel folder: /app/config/services.php and add this:

<?php

return array(
	// credentials for loggly.com
	'loggly' => array(
		'key'	=> 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx',
		'tag'	=> 'ProjectName_'.strtolower(App::environment()),
	),
);

Now if you’re using Laravel’s environment configuration, do this instead:

<?php

return array(
	// credentials for loggly.com
	'loggly' => array(
		'key'	=> getenv('services.loggly.key'),
		'tag'	=> 'ProjectName_'.strtolower(App::environment()),
	),
);

And then store the key in a similar way in your .env.local.php file, or on Laravel Forge.
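For completeness, the matching .env.local.php could look something like this (the array key just has to match whatever you pass to getenv() above):

<?php

// .env.local.php - Laravel 4 loads these values into the environment at boot
return array(
	'services.loggly.key' => 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx',
);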

4) Install the logging code

Now this is where the magic happens. Open up /app/start/global.php and find the “Application Error Logger” section. Replace it with this code:

/*
|--------------------------------------------------------------------------
| Application Error Logger
|--------------------------------------------------------------------------
|
| Here we will configure the error logger setup for the application which
| is built on top of the wonderful Monolog library. By default we will
| build a rotating log file setup which creates a new file each day.
|
*/
 
$logFile = 'log-'.php_sapi_name().'.txt';
Log::useDailyFiles(storage_path().'/logs/'.$logFile);
 
/*
 * Setup Loggly Handler
 */
$handler = new \Monolog\Handler\LogglyHandler(Config::get('services.loggly.key'),\Monolog\Logger::DEBUG);
$handler->setTag(Config::get('services.loggly.tag'));
 
$logger = Log::getMonolog();
$logger->pushHandler($handler);

This way, every environment will have its own tag, and in the Loggly dashboard you can easily see whether something happened on your dev/staging environment or on live. I also strongly advise setting up alerts within Loggly so you get notified once a couple of 500 errors start coming in.
Another nice thing about Loggly is that you can go back in time and easily see in the stats how many errors occurred in the past 24 hours; good luck doing that with log files! This is especially useful for keeping a close eye on cronjobs and incoming API calls.

5) Testing

Throw some 404 errors by opening a page in your project and appending some random characters to the URL. Wait a couple of minutes and there we go; you should see something like this:
(screenshot: the 404 errors showing up in the Loggly dashboard)

Also worth noting is that all other Laravel logging functions will also push to Loggly.
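For example, a couple of standard Log calls anywhere in your code will now show up in the Loggly dashboard as well (the messages and context values here are just examples):

// these go to the daily log file *and* to Loggly
Log::info('Order created', array('order_id' => 1234));
Log::warning('Stock running low', array('sku' => 'ABC-123'));
Log::error('Payment gateway timed out');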

More info about Laravel’s logging features: http://laravel.com/docs/4.2/errors#logging
More info about Loggly: https://www.loggly.com

php-invoiceocean, a PHP client to interact with the InvoiceOcean.com API

It’s been a while since I created a public PHP library to interact with another API but here we go.

php-invoiceocean is a PHP client to communicate with the InvoiceOcean.com API.

What is InvoiceOcean?

“The easiest way to online invoicing.” Over 70,000 companies are using the InvoiceOcean software. The application’s simplicity and intuitive interface are aimed at quick and efficient invoice issuing. Because of the SaaS environment, your data is securely stored in the cloud and accessible from anywhere in the world. Whether you are a small or medium business owner or an individual entrepreneur, InvoiceOcean will make your work easier.

A few facts

– it’s RESTful
– works with JSON exclusively (they also offer an XML API, but I like JSON more)
– it’s pretty smart in the sense that you only have to define the method names; there is no parameter or HTTP method checking (see the sketch below)
– works with Composer (obviously)
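To illustrate that last point about method names: this is not the library’s actual code, just a rough sketch of how a __call()-based client can map a method name like getClients() straight onto an API call (the URL format is an assumption as well):

<?php

// Rough sketch only: a magic __call() turns e.g. getClients() into a GET on /clients.json.
// The real php-invoiceocean client may work differently.
class SketchClient
{
    private $account;
    private $apiToken;

    public function __construct($account, $apiToken)
    {
        $this->account  = $account;
        $this->apiToken = $apiToken;
    }

    public function __call($name, $arguments)
    {
        // "getClients" -> "clients"
        $resource = strtolower(preg_replace('/^get/', '', $name));

        $url = sprintf(
            'https://%s.invoiceocean.com/%s.json?api_token=%s',
            $this->account,
            $resource,
            $this->apiToken
        );

        return json_decode(file_get_contents($url), true);
    }
}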

Clone this repo @ https://github.com/kryap/php-invoiceocean

How to use?

composer require kryap/php-invoiceocean
./composer.json has been updated
Loading composer repositories with package information
Updating dependencies (including require-dev)
- Installing kryap/php-invoiceocean (dev-master a8ccf4d)
Cloning a8ccf4dfd3b8daa611087822f27da2a773c073ce
 
Writing lock file
Generating autoload files
Generating optimized class loader

Here’s how you could use it:

<?php

// load Composer's autoloader first
require __DIR__ . '/vendor/autoload.php';

$io = new InvoiceOceanClient('username', 'api_token_goes_here');
$clients = $io->getClients();

var_dump($clients);

?>

Need more documentation about the InvoiceOcean API?

Visit these links:

Behind the scenes @ Coolblue HQ

What is Coolblue?

Last week I was invited to have a look behind the scenes at Coolblue in Rotterdam, NL. Coolblue has always been one of my favorite Dutch/Belgian webshops because of their great customer service, their clear but simple websites, and most importantly, their great range of products at very affordable prices.

They have roughly 50k unique products spread over 150-200 different webshops (they launch 2 new webshops per week on average), all of which they keep in stock at all times in one of their huge warehouses. Their biggest warehouse is more than 40,000 square meters, which is about 10-20 FIFA football fields. How big is that! Ordering before 23.59 means it’s delivered to your doorstep for free the next day, or the same day when paying additional shipping costs. Even on Sundays!

Why would they organise behind the scenes events?

The main idea behind this ‘Behind the Scenes’ event is that they are looking for talented programmers. They’re looking for roughly 100 new developers. They have about 40 right now, so it seems they have big plans. Are they going international? Who knows! But their CEO didn’t really deny it. Their total employee base is about 750 people.

Since I’m a freelancer I won’t be able to work for them, so why did I go to this event? Well, because their main dev talk was about scalability. I find this a very interesting and challenging topic, so getting insights from one of the biggest webshops in the Benelux was a great opportunity to learn.

So what did I learn at Coolblue?

  1. Use microservices / hypermedia APIs to flatten out your infrastructure. This allows you to easily isolate bugs and/or maintain those APIs without having to retest stable services on every deploy.
  2. Every microservice uses an isolated datastore: CouchDB for customer data, PostgreSQL for order + payment information, Elasticsearch for the product catalog.
  3. RabbitMQ as a central state-change notification mechanism. This is the glue between your microservices (see the sketch after this list).
  4. Use git pull requests to review & validate your team members’ changes.
  5. CentOS RPM packages to distribute everything.
  6. Puppet or Chef to install & maintain your (virtual) servers.
  7. StatsD + Graphite for advanced reporting.
  8. Nagios for alerting.
  9. Create a “Chaos Monkey”. Its sole purpose is to kill parts of your live environment once in a while. Your devs should then come up with solutions to auto-recover from these outages. This is how Netflix stayed online during the massive AWS outage a while ago: http://techblog.netflix.com/2012/07/chaos-monkey-released-into-wild.html
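To make point 3 a bit more concrete, here’s a minimal sketch of publishing a state-change event to RabbitMQ from PHP with php-amqplib (exchange name, payload and connection details are made up for the example; this is not Coolblue’s actual code):

<?php

require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// connect to a local RabbitMQ broker (default credentials, adjust as needed)
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();

// fan out all order state changes to whoever is interested
$channel->exchange_declare('order.state_changed', 'fanout', false, true, false);

$event = json_encode(array(
    'order_id'  => 1234,
    'old_state' => 'paid',
    'new_state' => 'shipped',
));

$channel->basic_publish(
    new AMQPMessage($event, array('content_type' => 'application/json')),
    'order.state_changed'
);

$channel->close();
$connection->close();

Other services then bind their own queue to that exchange and update their own datastore whenever such an event comes in.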

Have a look at the slides from their software architect.

Why should you attend one of these meetings?

  1. If you want a developer job @ Coolblue obviously.
  2. If you want to learn more about scalability.
  3. If you want to network with hundreds of other devs.

Inform yourself on this page: http://www.coolblue.nl/behindthescenes

Here’s a nice dutch article about the same event: http://www.dailybits.be/item/coolblue-behind-scenes/

Raspberry Pi PLC/Domotica testcase

Here’s a quick tutorial on how to build a hardware on/off switch which sends its signal to a RESTful web API, using a Raspberry Pi with Raspbian. This is in fact a small PLC test case (proof of concept), but the possibilities are endless!
I’m planning to use this to monitor certain events around my house. E.g. is a door open/closed? Is a device on/off?

Download & install Raspbian

Download the latest version of Raspbian and put it onto your Raspberry Pi’s SD card:
http://www.raspberrypi.org/downloads

Updates & dependencies

Do some updates + install the extra dependencies:

apt-get update
apt-get install python python-pycurl

Setup the hardware

To know which GPIO pins to use, you’ll have to identify the input/output pins. Here’s a map:

(diagram: Raspberry Pi GPIO pinout map)
Connect your Raspberry Pi’s GPIO header (the big black connector) to some switch or toggle. Here’s how I did it (test case):

(photo: the switch wired to the Raspberry Pi’s GPIO header)

We actually need 3 pins: one for I/O, one for power, and one for ground (safety first!). Make sure you solder the right cable to the right GPIO pin (see the map above).

Now you might find the naming of these pins confusing. That’s because there are 3 types of naming conventions in use here…

Pin number   RPi.GPIO   Raspberry Pi name   BCM2835
P1_01        1          3V3                 -
P1_02        2          5V0                 -
P1_03        3          SDA0                GPIO0
P1_04        4          DNC                 -
P1_05        5          SCL0                GPIO1
P1_06        6          GND                 -
P1_07        7          GPIO7               GPIO4
P1_08        8          TXD                 GPIO14
P1_09        9          DNC                 -
P1_10        10         RXD                 GPIO15
P1_11        11         GPIO0               GPIO17
P1_12        12         GPIO1               GPIO18
P1_13        13         GPIO2               GPIO21
P1_14        14         DNC                 -
P1_15        15         GPIO3               GPIO22
P1_16        16         GPIO4               GPIO23
P1_17        17         DNC                 -
P1_18        18         GPIO5               GPIO24
P1_19        19         SPI_MOSI            GPIO10
P1_20        20         DNC                 -
P1_21        21         SPI_MISO            GPIO9
P1_22        22         GPIO6               GPIO25
P1_23        23         SPI_SCLK            GPIO11
P1_24        24         SPI_CE0_N           GPIO8
P1_25        25         DNC                 -
P1_26        26         SPI_CE1_N           GPIO7

Anyway, let’s move on and try to catch the GPIO input using Python.

Read GPIO signals using Python

plc.py (daemon script)

import RPi.GPIO as GPIO
import time
import os

buttonPin = 7  # physical board pin 7
GPIO.setmode(GPIO.BOARD)
GPIO.setup(buttonPin, GPIO.IN)

while True:
  if (GPIO.input(buttonPin)):
    os.system("sudo python /home/pi/plc_handle.py")
    #print "button called"

plc_handle.py

import time
import RPi.GPIO as GPIO
import datetime
import pycurl

buttonPin = 7  # physical board pin 7
GPIO.setmode(GPIO.BOARD)
GPIO.setup(buttonPin, GPIO.IN)

# reset state
last_state = -1

while True:
  input = GPIO.input(buttonPin)
  now = datetime.datetime.now()

  # check if the value changed
  if (input != last_state):
    print "Button state changed:", input, " @ ", now
    api_url = "http://webserver.com/api/input.php"
    data = "location_id=1&status=%s" % input
    c = pycurl.Curl()
    c.setopt(pycurl.URL, api_url)
    c.setopt(pycurl.POST, 1)
    c.setopt(pycurl.POSTFIELDS, data)
    c.perform()
    c.close()

  # remember the previous input
  last_state = input

  # slight pause to debounce
  time.sleep(1)

You can run this script like this:

sudo python /home/pi/plc.py

Or add it to /etc/rc.local so it runs after each reboot. Note the trailing &, otherwise the boot process would hang on the endless loop:

python /home/pi/plc.py &
exit 0

Web API

Here’s a quick (and unsafe) ‘API’ script for receiving the Raspberry Pi’s signals:
input.php

<?php
header('Cache-Control: no-cache, must-revalidate');
header('Expires: Mon, 26 Jul 1997 05:00:00 GMT');
header('Content-type: application/json');

$dbh = new PDO('mysql:host=localhost;dbname=database', 'username', 'password');

$response = array(
    'status'    => 'nok'
);

if (!empty($_POST['location_id']) && isset($_POST['status']))
{
    $status = $_POST['status'];
    $location_id = $_POST['location_id'];

    // create log entry
    $sql = "INSERT INTO status_log (location_id, status, created_at, updated_at) VALUES (:location_id, :status, NOW(), NOW())";
    $q = $dbh->prepare($sql);
    $q->execute(array(':location_id' => $location_id,
                      ':status'      => $status));

    // update the location's current status
    $sql = "UPDATE location SET status=:status, updated_at=NOW() WHERE id=:location_id";
    $q = $dbh->prepare($sql);
    $q->execute(array(':location_id' => $location_id,
                      ':status'      => $status));

    // output
    $response = array(
        'status'    => 'ok'
    );
}

echo json_encode($response);
?>
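If you want to test the endpoint without the Pi attached, you can fire the same POST at it from any machine with PHP installed; a quick sketch (same placeholder URL as in the Python script, example values):

<?php

// simulate what plc_handle.py sends
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => 'Content-Type: application/x-www-form-urlencoded',
        'content' => http_build_query(array('location_id' => 1, 'status' => 1)),
    ),
));

echo file_get_contents('http://webserver.com/api/input.php', false, $context);
// should print {"status":"ok"}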

 
Now I’m very curious what sort of applications you guys are building with this Raspberry Pi “PLC implementation”. Feel free to post them in the comments section.

Useful links

 

Popular websites 10 years ago

Here are some screenshots of popular websites (LinkedIn, Facebook, Google, Twitter, eBay, YouTube, Wikipedia, Amazon, Hotmail, Blogger, Apple), but from 10 years ago… Pretty neat how things evolved, right? Who can still remember these?

(screenshots of the websites as they looked 10 years ago)

PHPBenelux Conference 2013

My first ‘official’ post (well, actually my second post after Hello World) was a review of the PHP Benelux 2012 conference, so why don’t we make a tradition out of it?
I must say, the conference is getting better and better each year with lots of interesting speakers, tutorials, visitors, and of course the enjoyable socials with lots of free Belgian beer!

PHP Benelux 2013 after movie:

And of course the by now famous stressball fight:

Facebox and remote ajax pages solution

While facebox allows you to load remote pages via AJAX, it doesn’t allow you to load ‘real remote’ pages from a different domain due to the cross-domain policy. This is where JSONP comes into play.

Here’s a fix for facebox (current version 1.3).
Open up facebox.js and search for the ‘fillFaceboxFromAjax’ function. Now replace

$.get(href, function(data) { $.facebox.reveal(data, klass) })

with

$.ajax({
   url: href,
   dataType: 'jsonp',
   error: function(xhr, status, error) {
      alert(error);
   },
   success: jsonpCallback
});

And add a callback function:

function jsonpCallback(response){
   $.facebox.reveal(response.data, '');
}

Now this will only work if you also modify the remote file you’re calling. Make sure it looks something like this:

header('content-type: application/json; charset=utf-8');
$data = array(
   'data' => $generatedoutput,
);
$json = json_encode($data);
 
echo isset($_GET['callback']) ? "{$_GET['callback']}($json)" : $json;

Where $generatedoutput is your actual data (in this case I use ob_start and ob_get_contents for ease; see the sketch below).
$_GET['callback'] is a parameter which is added automatically by jQuery. Read their docs for more info: http://api.jquery.com/jQuery.ajax/
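For completeness, here’s roughly what I mean by using ob_start and ob_get_contents to build $generatedoutput (a sketch; the included file is hypothetical and your actual page generation will differ):

<?php

// capture whatever the page would normally print
ob_start();
include 'remote_page_content.php'; // hypothetical file that echoes the HTML
$generatedoutput = ob_get_contents();
ob_end_clean();

header('content-type: application/json; charset=utf-8');

$json = json_encode(array('data' => $generatedoutput));

// wrap the output in the JSONP callback when jQuery asks for it
echo isset($_GET['callback']) ? "{$_GET['callback']}($json)" : $json;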