NodeJS Express Body-Parser Post Data Missing Problem

I am amazed that NodeJS's main web server package, Express, does not handle POST data on its own. I just don't get it. It requires a package called Body-Parser.

I copied in the sample code from the Express website, got the required packages, and made a little form to test it.

It did not work.

The docs explain that Body-Parser builds a request.body that contains the POST data. Mine did exist, but it was empty.

I did one million things to make sure that I was doing what I thought I was doing. Postman? Check. Curl? Check. Inch by inch inspection of my entry page? Check.

I got to looking at the POST on its way into the page with Firebug. I noticed that the encoding that Firefox was using was


The Body-Parser docs say that any of their decoders will take a type parameter to specify this. I found an example and tried it out:


Nope. I tried this in some decoder called raw(), in the urlencoded() one, I even put it into json() just in case. Nada.
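For the record, the kind of thing I was trying looked roughly like this. This is a sketch, not my exact code: the `type` option is body-parser's documented way to change which Content-Type header a decoder will accept, and the content type string below is illustrative.

```javascript
// Sketch: forcing body-parser to accept a specific content type.
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.urlencoded({
	extended: true,
	type: 'application/x-www-form-urlencoded' // illustrative; whatever Firebug reported
}));
```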

At my wits' end, I'm just trying things in Postman. I tell it to encode the data in various ways and, Voila!!, when I choose

multipart/form-data

It works.

WTF? I think. Everything tells me that Body-Parser SPECIFICALLY DOES NOT DO MULTIPART form data.

How much clearer could it be?

Then I realize, the sample code from the Express docs caused me to install (last night when this nightmare began) something called Multer. Experimentation tells me that this is the reason I can do multipart/form-data. Without Multer, I get nothing.

But without Multer, I get nothing no matter what I do. I redid everything I had tried before and still could not get Body-Parser to work. With Multer, only multipart/form-data. Without it, nothing.

If anyone can enter a comment telling me how I got this wrong, I would be grateful.

UPDATE: It was none of the above!!!

It turns out that, in the course of the above screwing around, I moved the assignment of the router to follow the assignment of the body-parser. Express executes middleware in registration order, so it runs the router before the parser if you tell it to. I had inserted the new body-parser code after the route. No reason, it just happened.

app.use(bodyParser.urlencoded({ extended: true }))

must precede

app.use('/', router);
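The rule generalizes: Express runs middleware in the order you register it. Here is a toy middleware runner (plain Node, my own names, not Express internals) that makes the failure mode visible: a parser registered after the route never gets to run before it.

```javascript
// Toy middleware runner demonstrating that use() order decides execution order.
function makeApp() {
	var stack = [];
	return {
		use: function (fn) { stack.push(fn); },
		handle: function (req) {
			var i = 0;
			(function next() {
				if (i < stack.length) { stack[i++](req, next); }
			})();
			return req;
		}
	};
}

// A "parser" that builds req.body, and a "route" that reads it.
var parser = function (req, next) { req.body = { parsed: true }; next(); };
var route = function (req) { req.seenBody = req.body; }; // terminal: never calls next()

var good = makeApp(); // parser before route: the route sees a body
good.use(parser);
good.use(route);

var bad = makeApp(); // route before parser: the parser never even runs
bad.use(route);
bad.use(parser);

console.log(good.handle({}).seenBody); // { parsed: true }
console.log(bad.handle({}).seenBody);  // undefined
```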

“This project is incompatible with the current version of Visual Studio”

Nice error message Microsoft. Why not just say, "Screw off. We don't care about your old projects." This is why I avoid them whenever possible.

However, in this case, it wasn't possible. It was some reference code for the protocol of a new thing I'm working on. It was done in Visual Studio 2013. I'm using Visual Studio 2015.

After doing a bunch of stuff that was totally useless (I am not very strong in .NET), I happened upon this simple solution...

Open the .csproj file in a text editor and change the ToolsVersion attribute from "12.0" to "14.0".
Or, if you need more detail:


<Project ToolsVersion="12.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

becomes

<Project ToolsVersion="14.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

After that it opened with no problem and I was able to build and use the project.

I have no idea what the ramifications of this could be in the long run. If I find any that are adverse, I will write it down here.

mysql: Load Data From and Select ... Into Outfile

Just so I don't forget...

I did a test of this.

select firstName, lastName from users into outfile 'tempTest9999'

Then, after creating an appropriate table...

load data infile 'tempTest9999' into table tempTest

The thing I don't want to forget is that the file was put into the directory...


The owner and group were both 'mysql'.

I know, it's what one would expect, but I forget these things.

Responsive Image Maps - Generator and jQuery Plugin

Bottom line, I need to do an image map in a responsive website.

First, I found this lovely image map generator. It is simple and works very well. It's good enough that I actually gave him a few bucks.

The resulting <map> worked right away.

Then I started working on the real website. The image loads at a size relative to the browser window, so the image map was out of alignment. I quickly realized the problem and googled Responsive Image Maps.

I found this:

I groaned. I hate adding dependencies and even more, I hate figuring out how to work new stuff that I'm probably not going to use again (this is the first image map I've used in years and once this project ends, probably the last for a long time). But, I did it.

The learning curve was ZERO.

I added the plugin. Copied his sample initialization and it worked perfectly the first time.


Tape Suppresses console.log() and Makes Debugging Difficult

I was seduced (see [1]) by the fact that the Tape unit testing tool (the "tap-producing test harness for node and browsers") does less. Not only does it not litter your Javascript universe with globals, it does a lot less magic stuff.

I am in the early stages of a new project and decided to do the right thing and test everything from the first moment. Having been irritated by the amount of arcane stuff in Mocha, I drank the koolaid and rewrote my starter testing for Tape.

Bad move.

Tape is simple to use but, when I started doing real development and a new test didn't run, I needed to debug and couldn't.

Turns out that Tape suppresses console.log(), process.exit(), etc. Of course, there are other ways to debug Javascript, but I am a fan of print-trace, i.e., console.log(), and you cannot use console.log() with Tape.
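One workaround I can suggest (my own idea, not something from Tape's docs): a TAP harness reports on stdout, so debug traces written to stderr stay out of its way. A minimal helper:

```javascript
// Print-trace helper that writes to stderr instead of stdout,
// keeping debug output apart from a TAP report stream.
var debug = function () {
	var args = Array.prototype.slice.call(arguments);
	process.stderr.write(args.join(' ') + '\n');
};

debug('checkpoint reached, value =', 42); // appears on stderr
```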

Searching the web, I found that this is not something that is noted very often. I don't know why. If I had known it, I would not have used Tape. In fact, I am going to revert to using Mocha. The elimination of testing globals is not sufficiently compelling to make it worth changing my approach to debugging.

[1] Why I use Tape Instead of Mocha & So Should You

Remote Volume Sharing with Ubuntu and fstab

I have two applications that run on my Ubuntu server. They both deposit their output into different directories on the same remote volume. To avoid confusing my tiny brain, I like to isolate separate applications into separate users on my server. (It allows me to login and have the environment initialized with appropriate, different, management tools.)

So, I set about making it so that the volume is mounted all the time. (That they are Windows volumes has, I believe, no bearing on this subject.) That is, the /etc/fstab file contains a couple of lines like this:

//0.0.0.0/c$ /home/appUserA/volumeName cifs uid=appUserA,rw,user,username=remoteName,password=******* 0 0

//0.0.0.0/c$ /home/appUserB/volumeName cifs uid=appUserB,rw,user,username=remoteName,password=******* 0 0

The important point is that the IP addresses (here shown as zeroes) are the same and the volume names (c$) match.

The mounting worked. I could see the volume in both user directories. I got the application for appUserA to work. Life is good. But, when I got to the other application for appUserB, I could not write to its destination directory. 

After screwing around a long time, I realized that Ubuntu locks the mounted directory for the user that touches it first. This, before I figured it out, made for confusing results. Sometimes A could write. Other times B. In any case, I could always write with sudo.

I will have to find a different way of making these directories accessible (probably a common mount point with aliases into the appropriate directories).

So will you.


Require JS and the CSS Loading Problem

I am using a lot of Javascript plugins these days and managing them has become a fairly big deal. Recently, I have adopted Asynchronous Module Definition. I am using Require JS and like it a lot.

The problem is, Require JS does not work with CSS and it turns out that loading CSS is as much of a hassle as loading Javascript.

I think that the basic solution is to put a link to the CSS file into the page. But, I work with a framework and I do not want to pollute it with content-specific code. Some of the pages/sites that use the framework do not need plugin X and should not have its CSS any more than they should have its JS.

Require JS says that it doesn't support CSS files because there isn't any good way to know when they arrive. It offers a tidbit of code for those, it says, who don't care about the timing of the arrival of the file.

This is nonsense. Following is code that does exactly that: it polls document.styleSheets until the file shows up. Unfortunately, since it's not built into Require JS, I have to wrap stuff up for a callback, but, c'est la vie.

var loadCssFile = function(filePath, callback){

  var link = document.createElement("link");
  link.type = "text/css";
  link.rel = "stylesheet";
  link.href = filePath;
  document.getElementsByTagName("head")[0].appendChild(link);

  var count = 0;
  var tryAgain = function(){
    for (var i = 0, len = document.styleSheets.length; i < len; i++){
      var element = document.styleSheets[i];
      if (element.href && element.href.match(filePath)){
        callback(); //the stylesheet has arrived; safe to proceed
        return;
      }
    }
    if (count++ < 10){
      setTimeout(tryAgain, 100); //not there yet; poll again
    } else {
      throw "loadCssFile() failed to find " + filePath;
    }
  };

  setTimeout(tryAgain, 100);
};


PHP Pretty Print with bbedit

PHP is old and ubiquitous. It is completely bizarre that there are almost no options to format your PHP. Javascript has eighteen formatters and evaluators and linters. PHP is really lame by comparison.

In fact, other than online versions, I only found one. (Your comments will be appreciated!) php_formatter is the only one I could find that I can run on my own computer. It is a pear component. I hate pear. It has all the worst aspects of PHP - inconsistent calls, structures and everything else.

But, I'm desperate. After a year of primarily NodeJS, I'm returning to PHP having developed a strong reliance on being able to clean up my code at the press of a button. Further, the project I'm returning to did not benefit from that technology, so it's got a million lines of code that is only as pretty as I had energy for on that particular day. Since I'm also trying to read an old project, this is an urgent need.

Since I have hated pear for much longer than I've had my current computer, it wasn't installed, so I had to install it. That was infinitely more hassle than it should have been (I know there are people who dislike NPM, but it works infinitely better than anything PHP, including composer). But, done.

After that, installing php_formatter was no big deal. It works, too. Except...

1) Php_formatter actually sets the PHP error_level explicitly. The PHP authors, in another example of what's bad about PHP, have changed the meaning of E_ALL. Now it includes E_STRICT. This is really stupid.

2) Php_formatter strips all empty lines from the code. I use empty lines. A lot.

So, here's the thing. I had to edit the php_formatter files to comment out their error specification. Amateurs. It was two files. I don't remember which, so do a multi-file search.

I use bbedit by Bare Bones Software (a great program). I want this to operate as a Text Filter. That's not so hard. bbedit passes the contents of the window (or selection) to the program and, bingo!, you are in business. In fact, php_formatter worked correctly once I got it working at all.

The problem is the stripping of blank lines. Turns out that php_formatter supposedly has a filter that preserves them but I could not make that work.

I made a NodeJS program that turns blank lines into comments. Then I beautify. Then I remove the comment characters. Voila! Blank lines are retained.

For the record, here's the Text Filter:


~/Scripts/bin/js/commentUnComment.js -c | php  /Users/tqwhite/Documents/webdev/pear/bin/php_beautifier -t | ~/Scripts/bin/js/commentUnComment.js -u

And here's commentUncomment.js:

[NOTE: This is updated as of 5/29/15. The second regex failed in some circumstances.]


var inString = '';

var cmdLineParameters = {};
for (var i = 0, len = process.argv.length; i < len; i++) {
	var element = process.argv[i];
	if (element == '-c') {
		cmdLineParameters.c = true;
	}
	if (element == '-u') {
		cmdLineParameters.u = true;
	}
}

var writeStuff = function() {
	var outString = '';
	if (cmdLineParameters.c) {
		//turn blank lines into comment lines
		outString = inString.replace(/\n\s*\n/gm, '\r//\r');
	}
	if (cmdLineParameters.u) {
		//strip the comment lines back out
		//updated 5/29/15: the earlier regex, /^\s*\/\/\s*$/gi, failed in some circumstances
		outString = inString.replace(/^\t*\/\/\s*$/gmi, '');
	}
	process.stdout.write(outString);
};

//the rest ========================================================

process.stdin.on('data', function(data){
	inString += data;
});
process.stdin.on('end', writeStuff);
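The two regexes can be sanity-checked in isolation. The sample input below is mine; the patterns are the ones from the -c and -u branches.

```javascript
// Round-trip check: blank lines -> // comment lines -> blank lines again.
var source = "line1\n\nline2\n";

// -c branch: each blank line becomes a // comment line.
var commented = source.replace(/\n\s*\n/gm, '\r//\r');

// -u branch (the 5/29/15 regex): strip those // lines back out.
var uncommented = commented.replace(/^\t*\/\/\s*$/gmi, '');

console.log(JSON.stringify(commented));
console.log(JSON.stringify(uncommented));
```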

Apple OSX Mail Program Changes the Port on Its Own

I use Mandrill for my outbound SMTP. For SSL, it operates through port 587.

I have had many occasions where my connection fails for no apparent reason. I'm just living my life and sending email and then, poof!, an email tells me it can't send with that server. I go through debugging, re-enter the credentials, and it works again.

I started thinking "last time it happened, wasn't it the port number that was different then, too?" So, I changed the name of the server to "Mandrill, s/b port 587" so I would know.

It happened again today. I went to look at the settings and there it was: the port had been changed to 486. WTF?

I changed it back to 587 and it worked.

I have changed the setting "Automatically detect and maintain account settings" to off. I'm going to bet that stops the problem. If not, I'll update here. If it's been a long time since 5/15/15, you are safe in seeing that as a solution.