PrestaShop Variable Shipping Carrier


Sometimes complex shipping and handling rules make it impossible for customers to place orders in PrestaShop: no instant shipping quotes, huge quantities and large weights, limited shipping addresses, and so on. And while PrestaShop administrators are able to create orders manually via the back-end, these manual orders are still subject to the same shipping carrier rules.




Consuming Hidden WCF RIA Services

A Silverlight application landed on my desk yesterday – an application that consumed a remote WCF RIA service running on Microsoft IIS. The service did not provide a public API, nor did disassembly with dotPeek yield the service manifests needed to construct a WCF client. WSDL files weren’t exposed either. A new, custom client was to be written by reverse engineering what was available, without any fancy configurations.





My very own standing desk

Today I finally got my custom-made standing desk installed. It’s a 2-in-1 actually, with surfaces for both standing and sitting modes. It’s quite compact and occupies far less space than my older workspaces. But enough talk, here are some pictures instead:

[Slideshow: photos of the new standing desk]

I’ll post an update in a week or so on how it feels to work upright. I haven’t had the chance to wrap my head around the experience in so little time, but I feel that my typing speed and accuracy have increased a bit. There’s also a sense of utter freedom of movement – I can now kick, punch (both the air and the wall around the desk) and stretch anytime, and even jog in place while I wait for compilation, downloads/uploads, chat responses, etc., or simply pause to think.

Have you tried using a standing desk? What benefits did you discover?



Bulk Reports and Digests for Gravity Forms


I have written yet another Gravity Forms plugin/addon. This time the plugin generates bulk reports for form entries – digests of sorts. Based on a set schedule (which can be altered using the cron_schedules filter), the addon aggregates all form entries it hasn’t seen yet (including very old ones) and sends them out to predefined e-mail addresses.

The whole thing probably works best with the regular single-shot notifications turned off.

Download it from GitHub now.



A new, much better home for my code

I had been having trouble with my former low-end VPS provider after two years of quite stable service. They decided to move data centers, and my OpenVZ box ended up corrupted during the move. I had been looking to move for quite a while anyway. First of all, I started using Arch Linux a while ago and have been enjoying effortless rolling updates and upgrades every day. My former server ran Ubuntu 10.04 for two years, and out of fear of breaking it during updates (yes, it happens more often than one might think) I was stuck with some pretty old libraries. Although I managed to compile PHP and nginx every six months or so to stay up to date, other newer packages required newer libraries, which in turn required a new kernel, and so on.

So I was looking for a VPS provider with Arch images. Amazon AWS is quite expensive, although Arch Linux AMI images are available from Uplink Labs. Besides that, I’ve also been looking to switch to XEN virtualization, for guaranteed memory, the power of swap and other advantages over the OpenVZ and Virtuozzo containers offered by many companies.

After trying out several alternatives on the low-end market, I’ve had nothing but headaches for the past month. So I decided to go with a safe, proven and mainstream provider – Linode. It fit my criteria: Arch images (1.8% of Linode deployments run Arch), XEN virtualization, quite low-end and budget-friendly, 2 TB of data transfer, and promised effortless upgrades. The only downside was their lack of support for PayPal payments (very probably justified), so I had to get a prepaid virtual card.

So, as of a couple of days ago, the new home for my dozen sites and repositories is a blazingly fast XEN Arch Linux box at Linode. I’m quite sure I won’t be disappointed.

What have you tried? What do you use now?



Resigning Tampered Android APKs

After tampering with a signed APK using tools like smali/baksmali or even apktool, here are the steps to rebuild and resign the Android application (run from the application root):

keytool -genkeypair -alias androiddebugkey -dname 'CN=Android Debug,O=Android,C=US' -keystore /tmp/debug.keystore -keyalg RSA -validity 10000 – generate a valid Android debug keypair (Signing in Debug Mode) with the password ‘android’ for both the keystore and the key

rm -rf META-INF – remove the old signature, if one exists

zip -9 -r out-unaligned.apk . – zip things up

jarsigner -sigalg MD5withRSA -digestalg SHA1 -keystore /tmp/debug.keystore out-unaligned.apk androiddebugkey – sign it

zipalign 4 out-unaligned.apk out.apk – align it

keytool -printcert -jarfile out.apk – check it

adb install -s out.apk – install it (you may need to uninstall a previous version of the application in case of certificate errors)
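
Strung together, the whole thing can be scripted. Here is a rough sketch that assumes the debug keystore from the first step already exists and reuses the file names above; run it from the application root:

#!/bin/bash
# Repackage, resign and reinstall a tampered APK (sketch).
set -e

KEYSTORE=/tmp/debug.keystore

rm -rf META-INF                                   # drop the old signature
zip -9 -r out-unaligned.apk .                     # repackage the tree
jarsigner -sigalg MD5withRSA -digestalg SHA1 \
    -keystore "$KEYSTORE" -storepass android \
    out-unaligned.apk androiddebugkey             # sign with the debug key
zipalign 4 out-unaligned.apk out.apk              # align it
keytool -printcert -jarfile out.apk               # sanity-check the certificate
adb install -s out.apk                            # install it (uninstall the old version first on certificate errors)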



tail -f | event

Monitoring log files for specific keywords and firing off an event turns out to be quite simple to accomplish in bash with a `while` loop.

#!/bin/bash
# $1 – log file to follow, $2 – keyword to look for

tail -f "$1" | while read -r line; do
    # keep only lines containing the keyword (case-insensitive)
    line=`echo -n "$line" | grep -i "$2"`
    if [ -n "$line" ]; then
        # mate-notify-send -t 0 "$2 has been logged"
        echo "$2 has been logged" | mail -s ...
    fi
done

It’s something I’ve been using quite a bit lately while waiting for keywords to show up in various local and remote logs (ssh ... "tail -f ..."). What log event monitoring tools do you use?
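
For remote logs the same loop can read from standard input instead of a file. A sketch – the host, log path, keyword and recipient are only placeholders:

ssh user@host "tail -f /var/log/app.log" | while read -r line; do
    # fire the event as soon as the keyword shows up
    if echo "$line" | grep -qi "error"; then
        echo "error has been logged" | mail -s "log alert" admin@example.com
    fi
done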

Also, since this is the second time I’ve decided to share a quick bash snippet, and I received some improvement feedback on the first one, I’ve created yet another “bash-utils” repository. Feel free to chime in.



Monitor Directory for Changes

Here’s a simple script that I set up for my development WSGI server to reload itself once changes in the source code are detected:

#!/bin/bash
# $1 – directory to watch, $2 – command to run when something changes

while true; do
    # hash the modification times of everything under the watched directory
    A=`find "$1" -printf '%t' | md5sum`
    sleep 1
    B=`find "$1" -printf '%t' | md5sum`
    if [ "$A" != "$B" ]; then
        echo "Detected change, doing: $2"
        eval "$2"
    fi
done

It’s very simple (a poor man’s replacement for inotify) and doesn’t do anything complicated. Usage: ./monitor.sh application "my-reload-services.sh". You can filter out unwanted stuff, like *.swp files, using find’s own tests (see man find and the sketch below).
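
For instance, a sketch of the same hashing step with Vim swap files excluded – the *.swp pattern is just an illustration:

# same as above, but ignore Vim swap files when hashing modification times
A=`find "$1" ! -name '*.swp' -printf '%t' | md5sum`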

What do you use to monitor for changes? How can the above script be improved?



Debugging Flask applications under uWSGI

Flask comes with a fantastic debug mode in its built-in server, but it is advertised as unusable under uWSGI due to some forking limitations, which I couldn’t quite understand. There is a way to get the development debug mode working in Flask (and Werkzeug) regardless of what everyone around seems to say.

Before the if __name__ == '__main__': part of your application, i.e. at the very end, you have to wrap a werkzeug.debug.DebuggedApplication middleware around your app object.

if app.debug:
    # wrap the WSGI callable in Werkzeug's interactive debugger middleware
    from werkzeug.debug import DebuggedApplication
    app.wsgi_app = DebuggedApplication(app.wsgi_app, evalex=True)

That’s it. Simple as that! Debug your Flask application from the browser, without using the built-in development server. Don’t forget to switch off debugging in production, as the console offers arbitrary code execution.

For everything else there’s Winpdb.



Setting up Flask with nginx

I’ve decided to implement one of my next projects in Python. I picked Flask as my HTTP framework for its lightweight, unopinionated design. It pretty much allows you to do everything at a low level, should you want to. And I do; I always prefer a low-level approach, without the bulkiness, APIs, configuration files, etc.

In any case, since my web server of choice has long been nginx (built-in servers don’t impress me much), having it serve Flask applications in a robust, reliable way was a requirement. The instructions for marrying nginx and Flask via uWSGI are quite clear. I compiled the uwsgi application container, read the docs and came up with the following startup command:

sudo uwsgi -s /tmp/uwsgi.application.sock --chdir /path/to/application -w application:app --uid "www-data" --gid "www-data" --touch-reload . --daemonize /var/log/uwsgi.log
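
Two of those flags make development smoother: --touch-reload . reloads the workers when the watched path’s modification time changes, and --daemonize sends the log to /var/log/uwsgi.log. A quick sketch of how I poke at it, assuming --touch-reload is indeed watching the chdir directory itself (paths are the ones from the command above):

touch /path/to/application    # bump the watched path's mtime to trigger a reload
tail -f /var/log/uwsgi.log    # watch the workers come back up in the daemonized log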

That uwsgi invocation is a development setting and will probably look very different in production. Since Flask doesn’t force any convention upon you, I picked the following project layout for now:

.
|-- application
|   |-- application.py
|   |-- models
|   |-- routes.py
|   `-- views
`-- static
    |-- favicon.ico
    |-- js
    |   `-- script.js
    `-- robots.txt

All Python code lives in the application directory, which is where uwsgi runs it from. Static files sit one directory above that, will be served from there, and won’t pollute the source tree. I could have kept the static directory under application just as well, but I’ll keep it outside for now. Here’s what I came up with for my nginx server block:

server {

    listen 80;
    server_name application.lo www.application.lo;

    root /path/to/application.lo/application;

    try_files /../static/$uri @application;

    location @application {
        include uwsgi_params;
        uwsgi_pass unix:/tmp/uwsgi.application.sock;
    }

}
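
A quick way to check both halves – a sketch, assuming application.lo resolves to this box (e.g. via /etc/hosts):

curl -I http://application.lo/robots.txt    # should come straight from the static directory via nginx
curl -I http://application.lo/              # should be proxied to the Flask app over the uWSGI socket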

This is quite suitable for now. Any recommendations?