PHP DateTime ISO 8601 is a liar

The PHP manual says this about DateTime::ISO8601:

Note: This format is not compatible with ISO-8601, but is left this way for backward compatibility reasons. Use DateTime::ATOM or DATE_ATOM for compatibility with ISO-8601 instead.


Ha! Why did you call it ISO-8601 then!
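The difference is just the timezone-offset specifier, but it's enough to trip up strict parsers. A quick sketch:

```php
<?php
$dt = new DateTime('2016-07-28T11:32:18', new DateTimeZone('UTC'));

// DateTime::ISO8601 uses the 'O' offset specifier: +0000, no colon.
// This is what the manual's compatibility note is about.
echo $dt->format(DateTime::ISO8601), "\n"; // 2016-07-28T11:32:18+0000

// DateTime::ATOM uses 'P': +00:00, with the colon, which is what
// most ISO-8601 consumers expect.
echo $dt->format(DateTime::ATOM), "\n";    // 2016-07-28T11:32:18+00:00
```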

WP REST API – Hiding Endpoints

As of WordPress 4.4, the first part of the WP-API project has been integrated into WP Core, specifically in /wp-includes/rest-api. There are a few filters and tricks that are not brought to light in the WordPress 4.4 Release Notes.

Disable default routes to standard WP Objects/Information

// WordPress 4.4/4.5/4.6 addition of the new REST API.
// Disable the default routes (not our custom ones).
remove_action( 'rest_api_init', 'create_initial_rest_routes', 0 );

Customizing {your-site}/wp-json

Controlling Endpoints Shown

// WordPress 4.4/4.5/4.6 addition of the new REST API.
// Hide these routes from being visible on /wp-json.
add_filter( 'rest_route_data', function ( $routes ) {
    // Example entries; substitute the routes you want hidden.
    $hiddenRoutes = [
        '/wp/v2/users',
    ];
    foreach ( $routes as $key => $route ) {
        if ( in_array( $key, $hiddenRoutes, true ) ) {
            unset( $routes[ $key ] );
        }
    }
    return $routes;
} );

Customizing the descriptions of your endpoints

// WordPress 4.4/4.5/4.6 addition of the new REST API.
add_filter( 'rest_endpoints_description', function ( $data ) {
    // Modify the endpoint descriptions in $data here before returning it.
    return $data;
} );


Salesforce Idiosyncrasies

When deserializing JSON objects passed into a REST endpoint, Salesforce preserves the case of the field names exactly as they appear in the incoming JSON.

If you take those fields and compare them to the keys returned by something like Schema.SObjectType.Contact.fields.getMap(), without manually lower-casing the field names, they'll never match up.


This seems like odd and inconsistent behavior but as long as you’re aware of it, it’s easy to correct for.

Add WP Theme file with limited access

Recently I found myself with limited access to a client site. All I had was wp-admin. No server or database or anything.

A neat trick: put a snippet in header.php that writes a new theme file, hit a page on the site so the snippet runs, then remove it. You now have a new template file you can edit on the client's site.

No remote access required.
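As a sketch of the trick (the filename and template contents here are hypothetical; the real point is that header.php runs on every page load, so any file-writing code you paste there executes as the web server user):

```php
<?php
// Hypothetical snippet, pasted temporarily into header.php via the
// wp-admin Theme Editor. On the next page load it writes a new, empty
// page template next to the other theme files, which then shows up in
// the Theme Editor for normal editing.
$template = get_template_directory() . '/page-custom.php';
if ( ! file_exists( $template ) ) {
    file_put_contents( $template, "<?php\n/* Template Name: Custom */\n" );
}
// Remember to remove this snippet from header.php afterwards.
```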

It does show you how vulnerable your whole site is if someone gets logged in with admin privileges.


Auto View Production logs

There is a little more setup than just running this command, but it's tremendously more expedient than what I was doing before.


gnome-terminal --window-with-profile={Name of your gnome user profile} -e 'ssh root@{server1name} tail -c 25000 -f /tmp/php_errors.log' && gnome-terminal --window-with-profile=AutoRun -e 'ssh root@{server1name} tail -c 25000 -f /tmp/php_errors.log'


What I do at CURE

Many people ask me questions like 'What's your title?', 'What's your job?', 'What do you do?'. Few of them question my dedication, work ethic, or the demeanor with which I carry out the work. None of the answers I give them really explain what I do, or have done, or will do. All of the answers I give are placeholders. I thought, for my own benefit, and perhaps others', it would be good to put it down in writing.

So much of what I do revolves around data for the organization; reading just a few of the items below will quickly justify my title of 'Lead Data Developer', but I'm not sure it tells the whole story.

Coordinated the global implementation of Google Apps for several hundred users, including migrating existing email accounts, training local IT professionals, mass account creation, and setting up MX records on domain names.

Self-taught expert on an archaic donor management platform (Blackbaud Raiser's Edge), which I thankfully don't have to interact with any longer. Delving deep enough to understand and write SQL queries directly against the database, mastering and understanding the limitations of its own query engine, and modifying and configuring many aspects of the system.

ETL (Extract-Transform-Load) scripts and processes for the above-mentioned archaic donor platform, as well as several other systems. I've written many Python scripts to take data out of one system (Extract), modify the format and structure of that data (Transform), and then import it into another system (Load).

Man-in-the-middle 'systems' to bridge information between systems: creating reporting wrappers around gateway processing systems, and building middleware applications to manage the flow of data from one system into another without problems. The best example is taking general user and payment information (not CC numbers) and saving and transporting it to another internal system.

Full-cycle requirements gathering, data migration, implementation, configuration, and customization (code) of a cloud-based CRM (Salesforce) as a donor management system, including a lot of code around donation processing, user and campaign creation, and reporting.

Building and customizing WordPress plugins. The simplest are wrappers around a single post type to display staff pages, and middleware plugins to store donation transaction information. The most complex is a software layer that generates materialized database views from denormalized WordPress data (an object and its object_meta table); as an extension to that, I built a basic report generator using those views.

Full-featured and robust local web application, written in ExtJS, with many data collection forms, for use in environments that are seldom connected to the internet. It is capable of storing and managing data locally and pushing it up to a server when there is an internet connection. I had some architecture help from my boss with this (he wrote almost all of the syncing code, both front-end and server-side). This was the project I cut my teeth on JavaScript with.

Web-based access to donor giving information, with the ability to generate PDFs of giving history, using information directly from the donor management system.

Customized Google Maps, using version 3 of their API, for constituent data: maps used for finding central locations for donor events, maps of donor locations, and branded maps showing where we have a presence in the world.

Lightweight JavaScript application for a more presentation-like interaction with our CUREkids system.

I'll add more as I remember them; there have been too many projects over the years.

Blogging is hard

Blogging is hard.

Writing is not hard. Anyone can write something and post it on the web. Writing is not blogging.

Blogging is adding something useful to the webiverse that other people will read and appreciate. Blogging is not the raw journal you keep by your bed, or the sketch notebook you keep with you at work.

Blogging, with purpose, means taking content and experiences from your everyday life and formulating them into a concise but informative article that will benefit others.


Important lessons in configuration

Mac OS X – Nginx + PHP-FPM + MySQL + Node

Don't install node packages before reading and understanding what they do. The potential for frustration is unbelievable.

I installed a package called local-tld, a node package designed to help you manage your hosts file. Then I proceeded to set up Nginx/FPM/MySQL and completely forgot about it. local-tld forwards everything coming in on port 80 to itself on port 5999, but you only discover that from the command line:

sudo ipfw list
# shows that something is forwarding all port 80 traffic to port 5999

lsof -n -i4TCP:5999 | grep LISTEN
# shows that it's node, but not which package

Meanwhile, before you get to this point, you’re bashing your head against things screaming ‘Why does it work on port 81 but not on port 80?’

Also, if you’re running Nginx + Fpm, this line is really important, either in your fastcgi_params file, or in the virtual hosts server config somewhere:

fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;

It's not there by default, but it tells Nginx to pass the PHP script's location to the CGI process spawned by PHP-FPM, so that FPM knows which file to run. Without it, you just get blank white screens. And if you add the line incorrectly, you get a PHP error of 'File not found'.
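For context, here is where that line typically lives in a minimal virtual host (the server name, root path, and FPM address below are assumptions; adjust them for your setup):

```nginx
server {
    listen 80;
    server_name example.test;
    root /var/www/example;

    location ~ \.php$ {
        include fastcgi_params;
        # Without this line, PHP-FPM never learns which file to execute.
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # Point at wherever PHP-FPM is listening (TCP port or unix socket).
        fastcgi_pass 127.0.0.1:9000;
    }
}
```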

Many lessons learned tonight about configuring a dev environment on Mac OS X Mavericks, via Homebrew.

A special shoutout and thanks to this post for all the great information and configs.

Oh, and one more: when installing CouchDB via Homebrew, SonicWall blocks the SpiderMonkey prerequisite from downloading from the mirror, as it matches some kind of JavaScript virus signature. Yet another needle in a haystack of dev environment setups.