Saturday, November 26, 2011

Interacting with Django Tastypie API Using Python Libraries

If you already have a Django web app, consider extending its functionality with django-tastypie, which has pretty decent docs as well. With django-tastypie you can create a RESTful web service. I am not going to talk about django-tastypie itself in this post; instead, I would like to share how to interact with a django-tastypie API using Python's urllib and urllib2 libraries.

The following code gives some idea of how to get data from the API using the ApiKeyAuthentication method.

import urllib2

# GET request; the username and API key are passed as query-string parameters
api_url = 'http://localhost:8080/api/app1/model1/?username=apiuser1&api_key=apikey_for_apiuser1'
http_req = urllib2.Request(api_url, headers={'Accept': 'application/json, text/html'})
try:
    resp = urllib2.urlopen(http_req)
    contents = resp.read()
    print contents
except urllib2.HTTPError as he:
    # urllib2 raises HTTPError for 4xx/5xx responses
    print he.getcode()
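
As an alternative to the query-string parameters, newer versions of django-tastypie also accept the credentials in an Authorization header. A minimal sketch, assuming the same hypothetical user and API key as above:

import urllib2

# Credentials sent via the Authorization header instead of the query string
api_url = 'http://localhost:8080/api/app1/model1/'
http_req = urllib2.Request(api_url, headers={
    'Accept': 'application/json',
    'Authorization': 'ApiKey apiuser1:apikey_for_apiuser1',
})
print urllib2.urlopen(http_req).read()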


The code below gives some idea of how to create data through the API, again using the ApiKeyAuthentication method.


import urllib2
import simplejson as json


# POSTing to the list endpoint creates a new resource from the JSON body
api_url = 'http://localhost:8080/api/app1/model1/?username=apiuser1&api_key=apikey_for_apiuser1'
json_post = json.dumps({"model_name": "TEST13"})
http_req = urllib2.Request(api_url, json_post,
                           {'Content-Type': 'application/json',
                            'Accept': 'application/json, text/html'})
try:
    response = urllib2.urlopen(http_req)
    print response.read()
    print response.getcode()  # 201 Created on success
    print response.info()     # response headers
    print response.msg
except urllib2.HTTPError as he:
    print he.getcode()
    print he.info()
    print he.msg
except urllib2.URLError as ue:
    # e.g. connection refused, DNS failure
    print ue
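
On success, tastypie responds with HTTP 201 Created and a Location header containing the URI of the newly created resource. Note that urllib2 raises HTTPError for any 4xx/5xx status, which is why the except block above reads the status and headers from the error object itself.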

Both examples use the JSON data format. Remember to set the appropriate Content-Type and Accept headers if you use a different data format.
If you need to use an HTTPS connection or a proxy, refer to this post for more information.


Distributing Large Files - Implementation Part 1

This post is related to my previous post about distributing large files: http://murzal-arsya.blogspot.com/2011/11/distributing-large-files.html.

I have finally implemented the distribution strategy I described in my previous post. First, a little bit of info about the infrastructure.

1 Server:
Brand/Type: IBM System x3650 M3
OS: CentOS 6
Service: Samba
NIC: 2x Gigabit

1 Switch: HP Procurve 1810 24g (Gigabit)

10 Clients:
Brand/Type: HP Pavilion HPE
OS: Windows 7
Apps: QGIS
NIC: 1x Gigabit


I have Samba configured as shared storage. Within the shared folder I put the large image files to be distributed to the clients. My strategy is to create a chain distribution: from the server to PC-01, from PC-01 to PC-02, and so on, until PC-10. I chose robocopy for its simplicity, and it does the job well.
So, I created a batch file that runs the commands below.

Since I need to access the shared folder, I have to make sure the connection can be established with the correct user for authentication. To do that, before running robocopy, I authenticate with the net use command:
>net use \\SERVER\share_images password1 /USER:user1
Then I can execute robocopy with something like the following:
>robocopy \\SERVER\share_images C:\images /E /PURGE /COPY:DAT
Here /E copies all subdirectories (including empty ones), /PURGE deletes destination files that no longer exist in the source (together these two are equivalent to /MIR), and /COPY:DAT copies the file data, attributes, and timestamps.
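
Putting the two commands together, the batch file could look like this minimal sketch (the share name, credentials, and paths are the hypothetical ones from above; on PC-02 the source would be \\PC-01\share_images, and so on down the chain):

@echo off
rem sync_images.bat - hypothetical batch file for one machine in the chain
rem Authenticate to the upstream share, then mirror it to the local folder.
net use \\SERVER\share_images password1 /USER:user1
robocopy \\SERVER\share_images C:\images /E /PURGE /COPY:DAT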

Ideally, synchronization would be triggered whenever a change is detected in a shared folder, with the sync processes queued when multiple changes are detected. That, however, is beyond the scope of this post.

Sunday, November 20, 2011

Python HTTPS POST Data Over Proxy

Posting data over HTTPS through a proxy is pretty simple in Python. Almost everything is done with urllib2; the only step outside urllib2 is encoding the POST data, which uses urllib.urlencode.
Here is an example:


import urllib, urllib2

# Route all HTTPS requests through the proxy
PROXY = urllib2.ProxyHandler({'https': 'http://your_proxy:your_proxy_port/'})
PROXY_OPENER = urllib2.build_opener(PROXY)
urllib2.install_opener(PROXY_OPENER)

URL = 'https://www.post-test-site.com/post.php'
POST = {'data1': 'Your Data1', 'data2': 'Your Data2'}

# urlencode produces 'data1=Your+Data1&data2=Your+Data2'; passing it as
# the data argument turns the request into a POST
POST_DATA = urllib.urlencode(POST)
HTTPS_REQ = urllib2.Request(URL, POST_DATA)
RESP = urllib2.urlopen(HTTPS_REQ)
OUTPUT = RESP.read()

print OUTPUT
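
Note that urllib2 in the Python 2 versions of this era does not verify the server's SSL certificate, so the connection is encrypted but the server's identity is not authenticated.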

When your proxy requires authentication, use urllib2.ProxyBasicAuthHandler together with urllib2.HTTPPasswordMgrWithDefaultRealm (urllib2.HTTPBasicAuthHandler answers 401 challenges from the server, not the 407 challenge a proxy sends).
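
A minimal sketch, assuming a hypothetical proxy host and credentials:

import urllib2

# Register the proxy credentials for whatever realm the proxy reports
PASSWORD_MGR = urllib2.HTTPPasswordMgrWithDefaultRealm()
PASSWORD_MGR.add_password(None, 'http://your_proxy:your_proxy_port/',
                          'proxy_user', 'proxy_password')

PROXY = urllib2.ProxyHandler({'https': 'http://your_proxy:your_proxy_port/'})
PROXY_AUTH = urllib2.ProxyBasicAuthHandler(PASSWORD_MGR)
OPENER = urllib2.build_opener(PROXY, PROXY_AUTH)
urllib2.install_opener(OPENER)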

Tuesday, November 15, 2011

Configure DHCPd to Work on Certain NIC(s)

Suppose your server has more than one network interface and you need the DHCP server to listen on only one (or some) of them.

Here is how you configure it on Ubuntu:

# File: /etc/default/dhcp3-server
INTERFACES="eth0"
or
INTERFACES="eth0 eth1"

On Red Hat/Fedora/CentOS:
# File: /etc/sysconfig/dhcpd
DHCPDARGS=eth1
or
DHCPDARGS="eth0 eth1"

Reference:
http://www.centos.org/docs/5/html/Deployment_Guide-en-US/s1-dhcp-configuring-server.html

Saturday, November 12, 2011

Distributing Large Files

One of the requirements of my new project is to be able to distribute large files to 10-15 clients as fast as possible. Fortunately, all NICs (server and clients) support Gigabit connectivity.
I first tested simply copying large files from the server to many clients simultaneously. Of course, it was a disaster. While in theory the copy speed would only decrease in proportion to the number of clients, in practice the network connections between the clients and the server, and also between the clients themselves, got disrupted; the network was down for some time. I did not investigate further what was happening; all I know is that even when it went well, it was far from the desired speed.
What next? I gathered some information on how to distribute large files over a LAN. Some suggest BitTorrent, a parallel filesystem, multicast copy, (win) mesh, and other approaches. My constraint is that the solution has to be as simple as possible, since there will not be an expert IT person around in the future.
While BitTorrent sounds pretty promising, I had to leave this option out, not only because of the skill level of the people who will maintain the system, but also because it would take some time to prepare a large file for distribution via BitTorrent. Another concern is that the many small torrent packets could really saturate the network.

As for a parallel filesystem, I had to ditch that one too, because it would be more complicated and confusing for a basic admin. I should also mention that all clients run Windows 7, and I am not really sure there is stable parallel filesystem support for it.

Multicast copy sounded interesting, but I could not find any applications for Windows 7.

So, I figured that I needed to make my own simple solution. The goal is to distribute large files, around 1-5 gigabytes each, without overloading the connection between the source (server) and the targets (PCs).

I came up with a kind of chain distribution: transfer the file from the server to PC1, from PC1 to PC2, and so on. This way, each machine can use all of its available incoming Gigabit bandwidth as well as its outgoing bandwidth. I tested it and it worked quite well. Of course, there will be a problem if one of the PCs is down: the distribution cannot continue to the PCs after it in the chain, so rerouting must be done either manually or automatically.

As for the tool to copy the data, one can use a simple copy, robocopy, SyncToy, or some other mechanism. I am planning to use robocopy from the command line.

I also plan to use a load-balancing technique with both NICs on the server, so I can create two chains within the network and transfer to both chains simultaneously.