I Love Javascript!
TQ White II

SSH issues with Mac OS X High Sierra (2018-04-18)

macOS sftp "no matching cipher found"

Add this to ~/.ssh/config

Host *
    SendEnv LANG LC_*
    Ciphers +aes256-cbc

Works like a charm.
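To check which ciphers your client supports (and confirm aes256-cbc is among them), OpenSSH has a query flag:

    ssh -Q cipher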

Thanks to Jason.

Public Key Encryption Playground (2018-01-02)

My interest in public key encryption continues. I wanted a tool I could actually use to play with it, so I wrote one.

It does these things:

1) Generate a key pair.
2) Extract a public key from a private key.
3) Manually enter public or private key.
4) Create a crypto text string from plain text input.
5) Extract plain text from a crypto string.


You can play with this at: http://genericwhite.com/rsaEncryptionDemo/

Code is available on github.

Public Key Encryption for NodeJS with node-rsa (2017-12-31)

When I was trying to make node-rsa work, I felt that the instructions were a little bit
cryptic. It took way too much time to figure out the hyphenated argument structure,
ie, pkcs1-public.

Also, I'm not a huge expert in encryption stuff, so it took way too long to figure out that
the key produced by ssh-keygen was in the wrong format and what to do about fixing it.

I decided that the things I learned need to be documented for posterity.

So, when I got it working, I tuned this up for readability and put it in a repo so
that you can find it. It does three things.

1) Encrypt with public key/decrypt with private key, both from files
2) Encrypt with private key/decrypt with public key, both from files
3) Generate keys to use for decryption and print them out

Change the variable testName to try them out.

Just navigate to the directory and run the file:

node testNodeRsa.js
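For reference, the core node-rsa calls look roughly like this. This is a minimal sketch, not the repo's exact code; the file name is illustrative and assumes a PKCS#1 PEM private key:

    const NodeRSA = require('node-rsa');
    const fs = require('fs');

    // import a PEM private key; note node-rsa's hyphenated format string
    const key = new NodeRSA(fs.readFileSync('keyName.pem', 'utf8'), 'pkcs1-private-pem');

    // encrypt with the public half, decrypt with the private half
    const cryptoText = key.encrypt('plain text', 'base64');
    console.log(key.decrypt(cryptoText, 'utf8')); // -> 'plain text'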

Bonus! For your convenience, here is the command to convert the .pub generated by ssh-keygen into a .pem:

ssh-keygen -f keyName.pub -e -m pem > keyName.pem

You're welcome.

CHAPTER TWO

I want to be able to use the keys in a browser. I figured these learnings were worth documenting, too.

I used Browserify.

browserify testNodeRsaBrowser.js -o testNodeRsaBrowserBrowserify.js
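The bundle then goes into a page with an ordinary script tag (file name from the command above):

    <script src="testNodeRsaBrowserBrowserify.js"></script>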

If you put this repository someplace you can serve html, it will let you play with it.

You can also play with it at: http://genericwhite.com/rsaEncryptionDemo/testNodeRsaBrowser.html

CHAPTER THREE

My interest in public key encryption continued, so I wrote a playground tool that generates key pairs, extracts a public key from a private key, accepts manually entered keys, and converts between plain text and crypto text. It is the same tool described in the 'Public Key Encryption Playground' post above.

You can play with it at: http://genericwhite.com/rsaEncryptionDemo/

The code is available on github.

Javascript Alert Debug: Coolest thing I ever did (2017-12-23)

Here's the problem. A co-worker got a bizarre alert() dialog in a web app. It popped up and just said "1". It wasn't her code and she was completely stumped. That's how I got involved.

I looked around the code. Searched for errant /alert\(/, etc. Nothing worked.

Then I hit the developer console. I typed the best single thing I ever typed:

window.alert = msg => { console.log(msg); console.trace(); };

Yeh, I did that. You may enjoy my awesomeness.

console.trace() gives a stack dump at the point where the alert dialog was called. That was very helpful.
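If you want the dialog to still appear after logging, a variant (sketch) that chains to the original:

    const origAlert = window.alert.bind(window);
    window.alert = msg => { console.trace(); origAlert(msg); };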

You are welcome.

SIF v3.5 Adds Support for Individualized Education Plans (2017-12-12)

Access for Learning (a4l.org) is happy to announce a major new addition to the School Interoperability Framework (SIF) data model to support students with special needs. Comprising two major components, xIndividualizedEducationPlan and xTransferIep, this release is the result of a two-year effort led by TQ White II of the Central Minnesota Educational Research and Development Corporation (cmerdc.org) with the help of national experts in special education and data modeling. The effort is motivated by the recognized need to make a student's individualized education plan (IEP) content available when a student transfers into a new school.

The new data models are intended to support three main use cases:

1) Immediate support for an administrator the very first time a student shows up in a new school.
2) Information to support the special education team as they adapt plans already in place to the resources and strategies of the receiving school.
3) Sufficient information for schools and districts to support reporting and resource management needs.

The goal is to ensure that a school has the information needed to provide students having special needs with critical, ongoing services.

This new model is based on a thorough survey of the standard form sets published by nearly every state, as well as the federal government. They were categorized into representative groups for an exhaustive inventory of data and evaluation of documentation strategies. With the input of a workgroup averaging about ten people, a structured hierarchy of elements was developed and refined. Once done, the work was passed to Jill Parkes, education data analyst at CEDS (Common Education Data Standards), a federal organization that develops a dictionary of education-related data definitions.

The CEDS process did two things. First, they evaluated each element in the new, tentative IEP data model and, where appropriate, attached a formal definition to it, either new or in reference to an existing definition. Then it was put into a formal CEDS community review. CEDS stakeholders, especially those with an interest in special education, reviewed the new definitions and approved them. This discussion improved confidence in the data design and made it more complete.

After this information was added to the XML, and with substantially more confidence, the data model was formally moved into the Access for Learning community review process. Though some people looked at the XML and offered comments, the main process involved TQ making presentations to various groups explaining the process and product in detail. Many valuable comments were made that resulted in changes, but two were especially valuable.

First was Megan Gangl, a co-worker of TQ's at cmERDC. Megan has spent her entire career as a special education teacher, case manager and leader of case managers. Her decades of experience brought many new details to the model, suggested reorganization of some parts and validated others. She identified missing details, helped to rename elements and refine both their data definitions and the explanations of their meanings. After the initial presentation, she spent several days collaborating on the model in detail. Once done, confidence in the usefulness, correctness and completeness of the model was again tremendously improved.

The day before community review started in October, a new person, Danielle Norton, joined the North American Technical Board. Danielle's team contributed to the community review with sessions including the detailed overview presentation and discussions with various subgroups of her team. A particularly important contribution was made by Rick Shafer, a long-experienced data architect, who noted some problems with normalization in the data model.

The initial motivation for the IEP effort was to support the transfer of students between schools or districts. Throughout the process, the foremost intention was to provide complete information for the receiving educational agency. As a consequence, the data model included data elements that were duplicates of things defined elsewhere in SIF. That is, it was badly de-normalized. This made the element provide a complete picture for a receiving district, but it was ill-suited for use as a local SIF entity object.

To solve this problem, the data model was split into two elements, xIndividualizedEducationPlan and xTransferIep. The former is completely normalized to serve as a formal entity. No data defined elsewhere in SIF is represented directly; it is instead referenced with a refId. If a receiving program needs to know those details, it is expected to query the appropriate system for them.

The latter is conceived as a reporting object, i.e., it is intended to wrap information that is defined elsewhere for convenient reference. The xTransferIep includes structures that allow it to contain data referenced in the IEP that would otherwise require a query to a system to which the receiving organization may not have access. The xTransferIep is a complete representation of an IEP containing all details.

In this process, a new concept was added to SIF, the typed refId. Troubled by the fact that refIds inside the IEP provided no information about where the target information referenced by the refId could be found, TQ added several new data types to the data model. Each is a UUID (as is the generic refId) but each also included documentation elements that explain what the UUID refers to and where the data can be found. For example, one of the new types, iepCommonStudentContactRefIdPointerType, explains that it references a contact inside a student object, distinct from iepCommonContactRefIdPointerType, which points to an independent xContactType, e.g., service provider or doctor, somewhere else.

The last thing is that, with the help of Access for Learning’s John Lovell, the new data models were refined to fit the new xPress object strategy. It does not use XML attributes and refIds are only present for elements that need it. This allows easier use of the model in non-Java/.NET systems. xPress is a more recent addition to SIF v3 and has proven to be easier to work with and, consequently, more popular. It is expected that xPress will be the foundation of new infrastructure work to formally bring JSON into the data model.

As with any first effort, it is fully understood by TQ and the entire community that, as this data model comes into actual use, shortcomings will be noted and ideas will be conceived. It is intended that the SpecEd/IEP workgroup will reconvene in the future to evaluate the results of implementation. That is to say that, as with the rest of SIF, the new IEP data models being released with SIF v3.5 are not the end of the effort to better support students with special needs. This release is the beginning of an ongoing effort to ensure that SIF is able to help schools, districts and teachers have the information needed to support optimal educational outcomes and to allow students with special needs to have the brightest possible future.

For even more information, a video recording of the IEP Data Model Overview is available HERE. To contact TQ White II, leave word in the comments.

mysql reset root password with fix for Socket problem (2017-07-11)

I don't reset mysql's root password often enough to remember how to do it. So, I google and go through endless hassle because all the examples are old or incomplete. It's maddening.

The main problem is that nobody includes the stuff below referring to /var/run/mysqld. I don't know why. Perhaps it was not needed in the past. However, it sure is now. You can tell you need it if you see this:

mysqld_safe Directory '/var/run/mysqld' for UNIX socket file don't exists (2)

when you try 'mysql -uroot mysql' without it.

The sequence below works on my Ubuntu 16 installation. 100%. I ran it a few times to make sure it works correctly and repeatably.

#mysql: reset root user password bash commands
sudo service mysql stop
sudo mkdir /var/run/mysqld
sudo chown mysql: /var/run/mysqld
sudo mysqld_safe --skip-grant-tables --skip-networking &
mysql -uroot mysql

#in mysql:
UPDATE mysql.user
SET
  authentication_string=PASSWORD('PUT_NEW_PASSWORD_HERE'),
  plugin='mysql_native_password'
WHERE User='root' AND Host='localhost';
exit;

#and back in bash
sudo mysqladmin -S /var/run/mysqld/mysqld.sock shutdown
sudo service mysql start

mysql -uroot -pPUT_NEW_PASSWORD_HERE
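To confirm the change took, you can check the plugin column from inside mysql:

    SELECT User, Host, plugin FROM mysql.user WHERE User='root';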

(Of course, mysql will beef that you put your password in the command line. Don't do it if your bash_history or logs could be accessed.)


By the way, I got this information from this website. Obviously this person is a genius. Props.

https://coderwall.com/p/j9btlg/reset-the-mysql-5-7-root-password-in-ubuntu-16-04-lts

SIF JSON Response (2017-07-10)

So, I saw the PDF John posted discussing JSON format ideas. Ian and Jon, you rock. It is a great document with excellent ideas. Most of it makes good sense to me and I'm sure that, as I reread and understand better, I will love it even more.

That said, there is one fundamental detail I do not love:

    authors: {
        '#contains': 'author',
        '#items': [{ '#value': 'John Smith' }, { '#value': 'Dave Jones' }]
    }

First reason is that the label 'authors' is plural but contains only one thing. In my opinion, things named plural should always be arrays. Second is that I envision a line of code like:

    const firstAuthor = inData.authors["#item"][0]["#value"];

Looks a lot like C# to me, low signal to noise ratio.

In my other Javascript life, we would be inclined to use inflection, i.e., the assumption that 'authors' has elements with an implied name of 'author' and vice versa. I can understand that our XML roots make this difficult to accept.

Consequently, I am inclined toward the 'everything is an object (if it's not a list)' approach. EG,

    authors: [
        { author: 'John Smith', '@type': 'bigshot' },
        { author: 'Dave Jones', '@type': 'contributor' }
    ]

This provides a data structure that mentions the word 'author' the same number of times as does the XML. That it also provides room for attributes is good. This seems nicer to me:

    const firstAuthor = inData.authors[0].author;

I don't know if this can be expressed properly with openAPI or if it violates some other rule of interaction with XML. I do know that, as a Javascript programmer, I would rather use the form I suggest.



Comment on Net Neutrality (2017-05-08)

Trump's FCC is about to permanently turn the internet over to the corporations. In a few years, your ISP will be like a cable provider. You can only access the sites that make them money. Other sites will be very slow or non-existent. You will pay for packages. Package A: YouTube, Netflix. Package B: Netflix and Hulu. You want to start an internet business, you will have to work a deal with every ISP corporation. It will be bad.

Go to GoFccYourself.com to be forwarded to the correct page for your comment. Do it every day.

GoFccYourself.com redirects you to the FCC's comment page. Click on the 'Express' link. It will take you to the entry form.

Suggested text:

Net neutrality is essential for freedom. Net neutrality requires Title II regulation of internet service providers. ISPs should be completely prevented from influencing the cost or performance of the internet resources and websites I want to use. My bandwidth purchase from my ISP and the site's bandwidth purchase from their ISP should be the only charges.

Macbook Display Port VGA Adapter Doesn't Work (2017-04-19)

Google as I might, nobody would tell me that the adapter had firmware that could be out of date. Eventually, I found a reference to the idea in the form of an updater that would not work.

The important thing is that I realized that if you cannot use your Macbook for presentations on a VGA projector or display, it might be that you simply need to buy a new adapter.

I did and now it works.

MacOS/OS X Dock presentation formatting with Spacers!! (2017-04-19)

I have a lot of stuff in my dock. I have long wished I could have some sort of grouping mechanism so it was easier to find what I want. Today I learned that you can add spaces to your Dock. Life is good.

Enter:

defaults write com.apple.dock persistent-apps -array-add '{"tile-type"="spacer-tile";}'
killall Dock

In your Dock, you will see a space that you can drag as you see fit. You can repeat the above as many times as you want. If you want to remove the space, just drag it out like anything else.

I added some to separate my email and web browser from my development tools and those from the rest of the stuff.

Using 'prettier', a Javascript formatter in bbedit (2017-01-30)

You have to have NodeJS. If you don't have it, google it and make it happen.

Then you need to install prettier. Its NPM page is here.

To install it, type...

npm install prettier -g

This installs it as a command line utility.

In some file (I do a lot of this, so I created a Scripts folder and called the file ~/Scripts/bin/js/runPrettier.js; you can do what works for you, just remember that the bash file below has to point to this file), insert this:

#!/usr/local/bin/node
const prettier = require('prettier');
var inString = '';

var writeStuff = function() {
    var outString = inString;
    outString = outString.replace(/^[\s]$/gm, "/*linebreak*/"); //I like to retain linebreaks
    outString = prettier.format(outString);
    outString = outString.replace(/\/\*linebreak\*\//gm, ""); //you can remove these if you don't
    process.stdout.write(outString);
};

//the rest: collect stdin, format when the stream ends =============================
process.stdin.resume();
process.stdin.setEncoding('utf8');
process.stdin.on('data', function(data) {
    inString += data;
});
process.stdin.on('end', writeStuff);

Then, in the bbedit Text Filters directory (~/Library/Application Support/BBEdit/Text Filters), create a file (I called mine 'runPrettier') containing this...

#!/bin/bash
~/Scripts/bin/js/runPrettier.js

In the terminal, make both scripts executable (note that the path with spaces must be quoted)...

chmod +x ~/Scripts/bin/js/runPrettier.js
chmod +x "$HOME/Library/Application Support/BBEdit/Text Filters/runPrettier"

and, voila!, you have an operating formatter for Javascript.
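You can sanity-check the filter from the terminal before wiring it into bbedit (paths as above):

    echo "var x=1;var  y =2" | ~/Scripts/bin/js/runPrettier.js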

I assigned mine to a command key so I can always make it pretty.


NGINX server_name is not working, ignoring config and getting the wrong server including SSL (2017-01-17)

When NGINX is trying to find something to serve, it tries to match all the server names BUT ONLY IF THERE IS A DEFAULT SITE.

I don't understand why it fails even when the server name matches something. However, if you have two separate servers with:

server_name xxx.com

server_name yyy.com

You would expect (assuming that the configs appear in this order) that http://yyy.com would match that server_name. It will not. It will match xxx.com. Why? Because when there is no default, it simply uses the first server. Period.

If you have a default, though...

server_name xxx.com

server_name yyy.com

default_server

It works. yyy.com will match yyy.com.

I came upon this problem because I had a configuration that included the default file that comes in the distribution, and it worked.

Then I added SSL. It did not work. Having long forgotten the issue with defaults, I debugged like a madman. Then I thought about the default issue (I ran into it sometime in the dark past - it is buried in the docs) and saw: there's a default right there!!!

Eventually (I know. This is the least entertaining punchline in history.), I realized that there was no default for port 443. QED


# Default server configuration
#
server {
    listen 80 default_server;
    listen [::]:80 default_server;
    root /var/www/html;
    index index.html index.htm index.nginx-debian.html;
    server_name _;
    location / {
        try_files $uri $uri/ =404;
    }
}
server {
    listen 443 ssl default_server;
    listen [::]:443 ssl default_server;
    ssl on;
    ssl_certificate /etc/ssl/PATH/TO/CERT.cer;
    ssl_certificate_key /etc/ssl/PATH/TO/CERT.key;
    root /var/www/html;
    index index.html index.htm index.nginx-debian.html;
    server_name _;
    location / {
        try_files $uri $uri/ =404;
    }
}

Using iTerm2 for the bbedit 'Go here in terminal' command (2016-12-27)

Turns out that bbedit uses the standard system terminal. Someday I will figure out how to subvert that because I cannot imagine any reason that I would ever want to use Apple's dumb old terminal program when iTerm2 exists. In the meantime, I asked the lovely people at Bare Bones Software how to change bbedit's behavior.

Turns out it's right there in the Expert Preferences list (about a hundred obscure things that I never thought of as I searched the Preferences window for some way to control this). To make it easier for future generations, I offer the complete command line:

    defaults write com.barebones.bbedit TerminalBundleID -string "com.googlecode.iterm2"

It works and makes bbedit another fraction better.
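If you ever want the stock Terminal behavior back, deleting the key restores the default (standard defaults syntax):

    defaults delete com.barebones.bbedit TerminalBundleID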

nodemon does not watch node_modules (2016-11-29)

I'm no fan of the node_modules structure. I'd have gone for a dichotomy, node_library and node_modules. node_library would be the place that npm and yarn install stuff from npmjs.org. node_modules would be for my modules and other code. I would have node search up the tree in both folders. Complicated, yes. But having no systematic way of determining which code is mine is simply awful.

In one of the goofiest decisions in the node world, the project monitor, nodemon, does not look inside node_modules to figure out whether to restart a project when files are changed. This is certainly because it would have to monitor a zillion files, and the maintainer worries about performance.

The problem is that my projects are composed of node modules, and they are put in node_modules. If there is a better place to put them, I beg you, write some comments and save me and the rest of us the misery.

So, if you

cannot get nodemon to restart your project or
nodemon won't detect changed files
nodemon will not watch node_modules
(put search phrases in the comments, please)

you can remedy the situation by adding an ignoreRoot key to your nodemon.json file:
{
  "ignoreRoot": [".git", ".jpg", ".whatever"]
}


This overrides the default ignore behavior entirely. Choosing to not list node_modules means they can now be watched.

This is, in fact, explained on the github site (here) but, you have to read a lot of stuff to get to it and it doesn't get found by google.

Perhaps this will change that.

Scroll Bars on Macintosh (2016-11-29)

Among the worst things that Apple ever did was make it so that scroll bars are only visible when you want to use them. Often it's hard to figure out how to activate them or they go away before I'm done using them.

Of course, Apple is actually awesome and, after all these years, I just realized that they provide a way to make them show all the time. In the two days since I discovered this, I have been happy again.

To accomplish this minor miracle...

System Preferences -> General -> Show Scroll Bars -> Always

It's like being able to breathe again.
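The same setting can be flipped from the terminal; this writes the global default that the checkbox controls (running apps may need a relaunch to notice):

    defaults write NSGlobalDomain AppleShowScrollBars -string "Always"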

Semi-colons should be considered mandatory in Javascript (2016-09-25)

As long as this

var obj={a:'a', b:'b'};
var a = obj
[a].forEach(()=>{})

produces an error (without a semi-colon after obj, the next line parses as a subscript: var a = obj[a].forEach(...)), semi-colons are mandatory and any suggestion to the contrary is childish perversity.
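With the semi-colon in place, the same three lines parse as intended:

    var obj = { a: 'a', b: 'b' };
    var a = obj;
    [a].forEach(() => {});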


rsync error: error in rsync protocol data stream (code 12) (2016-02-29)

The internet failed to tell me:

This error can result from not having one of the directories present. Yes, I know that rsync creates lots of directories very nicely. Not all of them.

To wit:

rsync someDirectory someUser@1.1.1.1:/home/someUser/system/code

Gave me the data stream error until I created the directory 'system'.

I can't imagine why. The only thing that is distinctive about 'system' is that it is also ~/system, ie, at the top of the user's directory. (In hindsight, the pattern is that rsync creates the final destination directory but not missing parent directories along the way.)
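The workaround amounts to creating the missing parent first (same illustrative host and paths as above):

    ssh someUser@1.1.1.1 'mkdir -p /home/someUser/system'
    rsync someDirectory someUser@1.1.1.1:/home/someUser/system/code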



Windows is Bad, Example #billion (2016-01-15)

I often note that whenever Microsoft has a decision to make, they unerringly choose the worst alternative. Most of these are so small that I forget about them but today, I ran into one that is so perfect, I want to remember it forever.

As with my Macintosh, one way to rename a file is to click once directly on the text of the name. It will turn into a text entry and select the name, awaiting the new file name.

Occasionally, I want to append something to the name, eg, BACKUP, HOLD, TMP, DISCARD, etc., so that I can use the real file name for something new without having to lose access to the old file.

On my Macintosh, I open the name for editing and touch the right arrow. This moves the cursor to the end of the selection and lets me keep typing.

On my Windows machine, it moves it past the end of the selection and then past the period so that I am changing the file extension. While it's not impossible that this is what I want (on my Mac, select-all, right-arrow, delete delete delete), it's relatively rare.

Which is to say, Windows forces me to do extra work to accomplish the most common function. Like always. It also increases the likelihood of error.

How much does Windows suck? Just sayin'.

Access Web.config from Classic ASP (2016-01-14)

I am in pain. The things I do for money. Visual Basic in asp. Kill me now!

I am handed this app with the URL of a JSON endpoint hard coded into it. That might not be horrible except that this is not a one-shot application. This is something we sell to a lot of people, and the endpoint needs to point to the customer's domain or that nasty cross-origin stuff will bite you.

After sneering about the laziness, I set about looking for the equivalent of getEnv() or process.env.varName or any of the sane ways that other language systems provide access to configuration data.

Nope. This is Microsoft after all. Not only can't I find said simple 'read the config' function, I can't really find a straight answer. (The fact that I didn't know to google "Classic ASP" didn't help. - Hey! Stop laughing. I still am not sure what the language is called. vbscript, visual basic, asp... WTF? I hate Microsoft apps. I'm only doing this because nobody else in my shop could figure out why it was broken.)

Of course, StackOverflow eventually came to the rescue, sort of. A nice person named Connor worked through this problem. It didn't work for me right away but, thanks. It saved my bacon. I present it here so I can find it again if I am ever again unable to escape a Classic ASP task, and so that it has more good google keywords. This is copied directly out of my app. It works.


'****************************** GetConfigValue *******************************
' Purpose:      Utility function to get value from a configuration file.
' Conditions:   CONFIG_FILE_PATH must refer to a valid XML file
' Input:        sectionName - a section in the file, eg, appSettings
'               attrName - refers to the "key" attribute of an entry
' Output:       A string containing the value of the appropriate entry
'**********************************************************************************

CONFIG_FILE_PATH="Web.config" 'if no qualifier, refers to this directory. can point elsewhere.
Function GetConfigValue(sectionName, attrName)
    Dim oXML, oNode, oChild, oAttr, dsn
    Set oXML=Server.CreateObject("Microsoft.XMLDOM")
    oXML.Async = "false"
    oXML.Load(Server.MapPath(CONFIG_FILE_PATH))
    Set oNode = oXML.GetElementsByTagName(sectionName).Item(0)
    Set oChild = oNode.GetElementsByTagName("add")
    ' Get the first match
    For Each oAttr in oChild
        If  oAttr.getAttribute("key") = attrName then
            dsn = oAttr.getAttribute("value")
            GetConfigValue = dsn
            Exit Function
        End If
    Next
End Function

settingValue = GetConfigValue("appSettings", "someKeyName")
Response.Write(settingValue)
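For reference, this is the shape of the Web.config section the function walks; the key name and value here are illustrative:

    <configuration>
      <appSettings>
        <add key="someKeyName" value="https://customer.example.com/endpoint" />
      </appSettings>
    </configuration>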


ps, here's Connor's post: http://stackoverflow.com/questions/28960446/having-classic-asp-read-in-a-key-from-appsettings-in-a-web-config-file/34779392#34779392

why use "console.log.bind(console)" (2016-01-03)

Oddly, I never saw this construct much until recently (late 2015). I suppose that's because the problem it solves wasn't common before promises arrived on the scene and people wanted to pass console.log around for debugging purposes. When it did start showing up, I was perplexed. I am 100% comfortable with the details of the bind() function and why to use it. I could not, for the life of me, understand why it was needed here.

The answer:

It's not needed. I tried to get the behavior discussed below in a few contexts and it appears that console.log has been revised so that this isn't needed. Or, maybe it is. I didn't do a comprehensive check of the entire world.

Well, then, wtf?

Turns out that, in some previous world (perhaps some browsers today that I don't have the patience to experiment with), console.log() contains a reference to its own this, which is lost in a simple function assignment. IE...

If you, for some reason, want to pass the function console.log around, eg,

var xxx=console.log

xxx('message');

It won't work. The console.log() method refers to this.something that isn't part of its new, post-assignment 'this' (ie, the global scope, which is 'this' when nothing else is applicable).

That is, if you could magically add console.dir(this) to console.log, you would get something in this spirit...

{
    log: function,
    somethingElse: godKnowsWhat
}

But, if you were magically able to get that same console.dir(this) after the assignment to xxx, it would produce something like this...

{
    //global scope items that do not include anything named somethingElse
}

The construct in question specifies the value of this for the newly assigned xxx

var xxx=console.log.bind(console);

That is, it will be bound to its original 'this', (ie, this===console) and it will be able to find whatever somethingElse it needs.

Mongo Mongoose Objects Weird Properties Can't Access (2016-01-01)

Yep. I'm all about search engine titles this week.

In my new Mongo/mongoose world, I've exposed myself to a ton of information, tutorials, explanations, examples, you name it, and there is a HUGE thing that no one mentions.

Perhaps I'm an idiot or maybe the fact that I work entirely alone means that I am not privy to common knowledge so common that no one needs to say it but, I'm saying it now:

Mongoose does not create nice Javascript objects. When you do a find, you get something that will console.dir() and look like something you recognize. It is a false appearance. It is some nasty Mongo thing. Eg,...

I had an object that, viewed with my general purpose display program, looked perfectly ordinary. It didn't look like it should be a problem. Addressing it as...

var item=recipient[0][0];

recipient[0] gave me the first array element, an object.

recipient[0][0] attempted to address the object containing the email address and came back undefined.

Nothing I could do gave me the email address. It was insane.

Turns out that I could google the answer pretty easily once I realized what was going on but only after tearing my hair out because no one mentioned that mongoose produces bizarre data structures.

The answer is (drum roll, please)...

.lean()

as in...

query.lean().exec(wrapCallback(callback)); //this produces a plain, Javascript object

as compared to

query.exec(wrapCallback(callback)); //this produces a mongoose object from hell

Obviously, if you are doing your processing in a model, you might really like the benefits of using the mongoose data type. I bet there are a TON of them but, if you are trying to construct a test, as I was, or send it to a web page, like I plan to do, you are going to want something that acts like a civilized citizen, not some rogue toxic waste factory.

But, I wouldn't have minded at all if somewhere in the mongoose quick start and documentation they had said, "The output of find() must be accessed with mongoose utilities unless you add the lean() function to convert it to plain Javascript."

Maybe you got to this note before you actually ruined yourself. Or, maybe you, like everyone else on earth, already know.

Mongo Mongoose Simple Tutorial Quick Start Code Example (2015-12-31)

How's that for a search engine friendly title? It's a combination of the things I tried as I started my experience with Mongo/mongoose that did not give me what I wanted. What I did get were all sorts of useful explanations and advice. What I did not get is a simple bit of working code to satisfy my Hello World needs.

The code below includes the lesson, 'use createConnection(), not connect()'. The former allows you to access the database from separate model classes without having to pass a connection object around. In the depths of the mongoose forums, the fancy guys admit that they prefer createConnection() so it's ok.

It also has my curried default test callback. I had a problem when I tested the example that required it, so I left it in so you can more easily solve whatever screws it up for you.

I had two problems you might want to know about:

You will notice that the boilerplate has a peculiar model name (userTqTest). I built this example inside my real, working project. It turns out that Mongo does not forget old model definition stuff. I haven't quite worked out the details but the bottom line is that when I declared this model as User and saved, I got errors based on the validation in my previous, real model. That is, the mongoose model does not supersede the one inside Mongo itself. I got over it by, first, dropping the collection. Then, since I don't want you to have to drop your User collection, I changed it to a name that is very unlikely to collide with your database.

Second is one that you will probably notice instantly but, because I was focused so much on showing the simple syntax, I didn't think about right away: these calls to save, get and delete are asynchronous. They are triggered in the right order but they complete in whatever order Mongo takes care of them.

I didn't want to complicate the example with async or some other method to sequence the calls, so I didn't. This code definitely works if you run the three functions one at a time but gives unpredictable results otherwise. My problem was that the delete fired before the save was complete, leaving the record in place to give me a duplicate key error on the second pass.

Enjoy.

========================================

        var mongoose = require('mongoose'); //these two requires are assumed by the example
        var Schema = mongoose.Schema;

        var mongoConnection = mongoose.createConnection("mongodb://localhost:27017/test");
        mongoConnection.on('error', function(err) {
            throw ("user code says, mongoose failed")
        });

        var userSchema = new Schema({
            userName: {
                type: String,
                unique: true
            } /*obviously you'd have more properties*/
        });

        var userAccessor = mongoConnection.model('UserTqTest', userSchema);

        var saveUser = function(user, callback) {
            /*you'd probably do some validation even though the mongoose will also validate
            you'd also wrap stuff in try/catch but that's not the point of this example.*/

            var newUser = new userAccessor(user);
            newUser.save(callback('saveUser' + '/' + user.userName));
        }

        var getUser = function(user, callback) {
            var query = userAccessor.findOne(user);
            query.exec(callback('getUser'));

        }

        var deleteUser = function(user, callback) {
            userAccessor.remove(user, callback('deleteUser'));
        }

        var callback = function(callingName) {
            return function(err, result) {
                console.log(callingName + '============');
                console.dir(err);
                console.dir(result);
                console.log('=====');
            }
        }

        saveUser({
            userName: 'tqwhite'
        }, callback);


        getUser({
            userName: 'tqwhite'
        }, callback);


        deleteUser({
            userName: 'tqwhite'
        }, callback);

Running Multiple Versions of NodeJS on One Server (2015-12-22)

In the modern age, node installation is handled by the utility 'n', eg,

sudo n stable #the current stable version

or,

sudo n 0.10.40 # to install v0.10.40

Once you have installed a version, it never goes away. You can switch back and forth between versions instantly. That means that you can easily alternate between two (or more) versions if you want.

But, you might (as I did) have an app that must run on an old version but you otherwise want to use a modern version. That is, you want two different versions active at once.

This can be done by referring to the node binary with a fully qualified path. (Remember, when you type "node", your shell is simply giving you /usr/local/bin/node.)

The program 'n' allows you to find the path to the binary (assuming that you have previously installed it), ie,

n bin 0.10.40

->/usr/local/n/versions/node/0.10.40/bin/node

You can then type

/usr/local/n/versions/node/0.10.40/bin/node SOME_OLD_APP.js

into your shell or put it into your initialization script or whatever.

I create separate users for each of the apps I have running on my server. There is only one that requires the old version of node. I want to use the latest stable version for everything else so, I have done "n stable". For the user account where I run the old app, I change my .bash_alias file to include...

alias node="/usr/local/n/versions/node/0.10.40/bin/node"

Then I can type

node SOME_OLD_APP.js

and get good results, or,

node --version

->v0.10.40

But when I log in as the user for a different app, I get the latest stable version.

Obviously, you can do this for as many node versions as you want but, if you need a lot, you might want to think about why (for me, it is Meteor, which only runs on old node).


UPDATE:

I have had occasion since I wrote this to install a new server. I was reminded that the utility 'n' does not come automatically with a NodeJS install. After you install NodeJS (which also installs npm), type

sudo npm install n -g

and then you can verify it by typing

n --help

and you will get all kinds of good info.

Color and Other Format for Fargo.io (2015-12-17)

Dave Winer is the guru of outlining. He is the person who moved outlines from English class onto the computer in the eighties. He's done a lot of other things, too. But, his development tool Frontier was a game changer for me. I eventually had to leave it behind because my brain needs color. I need code coloring when I'm programming and I need color cues to work productively with an outliner (everything else, too; my emails always have colored sections).

His newest outliner, Fargo, is very cool (at http://fargo.io). It runs in a web browser and saves your files into your Dropbox. Awesome. Unfortunately, it's also black and white and a serif typeface. I don't like that and, honestly, I have a hard time working with black type if there is very much of it. I tried it out. Liked it. I continued to use Omni Outliner. (If I could get Frontier to use Omni's formatting, I would be so happy.)

Lately I've been working on a very outline intensive project and now I want to work on it with other people. But, it really needs to be an active outline with expanding and collapsing sections.

I discovered that I could export the outline from Omni into OPML, the basic data structure Dave uses for Fargo. I copied the OPML into the Dropbox file Fargo uses and, boom!, I had Fargo functionality.

To my eye, it was ugly but it turns out that Dave did a BEAUTIFUL THING. He made it so that you can execute Javascript from the outliner. No kidding. You just type some JS into the outline, hit cmd-/ and it runs.

Here's what I did:

$('body').prepend("<script src='http://static.tqwhite.org/iepProject/formatFargo.js'>");


That's right, I loaded a chunk of JS from my static server. That code changes the stock look (black serif text) into sans-serif type with a distinct color for each outline level; the original post showed before-and-after screenshots here.

It took a fair amount of reverse engineering to figure it out but, it works like a charm.


Here's the code if you want to do your own colorizing:

var colorize = function() {
    $('.concord .concord-node .concord-wrapper .concord-text').css({'font-family': 'sans-serif'});
    $('.concord-level-1-text').css({'color': 'black'});
    $('.concord-level-2-text').css({'color': '#664F58'});
    $('.concord-level-3-text').css({'color': '#456D72'});
    $('.concord-level-4-text').css({'color': '#AD9470'});
    $('.concord-level-5-text').css({'color': '#D3AF74'});
    $('.concord-level-6-text').css({'color': '#90967E'});

    $('.concord-level-7-text').css({'color': '#778'});
    $('.concord-level-8-text').css({'color': '#788'});

    $('.concord .concord-node > .concord-wrapper').css({'background': 'white'});
    $('.concord .concord-node.selected > .concord-wrapper').css({'background': 'rgb(245,250, 250)'});
    $('.concord .concord-node.selected').find('li .concord-wrapper').css('background', 'rgb(245,250, 250)')
}

$('body').bind('keyup', colorize);
$('body').bind('click', colorize);

colorize();

document.styleSheets[0].insertRule(".selected { background:rgb(245,250, 250); }", 0);
document.styleSheets[0].insertRule(".selected div { background:rgb(245,250, 250); }", 0);
document.styleSheets[0].insertRule(".selected div { color:normal; }", 0);
document.styleSheets[0].insertRule(".selected i { background:rgb(245,250, 250); }", 0);

Reinstalling NodeJS and npm (2015-12-12)

Recently, I upgraded to the latest NodeJS/npm and npm stopped working. It turns out that there was a problem with the OSX installer.

After painful amounts of googling, I found that some had solved it by "tracking down" the node and npm files, removing them and then reinstalling with the node distribution download (dmg) from nodejs.org.

I tracked down the files. Do this:


sudo rm -rf /usr/local/bin/node

sudo rm /usr/local/bin/npm

sudo rm -rf /usr/local/lib/node_modules/npm


And then hit up http://nodejs.org for a new installer. You will have a fresh working installation.


PS, the problem with npm was this:

When I typed

npm init

to start up a node module, I got errors that included

Error: Cannot find module 'github-url-from-git'

Turns out that basically everything I did with npm except --version was broken in this way.

The 'delete before reinstalling' process listed above fixed it.

Visual Studio 2015, .NET 5 rc1, dnu restore, asp.net missing (I can't believe it either) (2015-11-24)

It's been a half dozen years since I started a new project in Visual Studio. I was a little excited at the prospect. I like learning things and I know a lot about almost all the rest of the internet development topics.

I looked up the latest stuff and it turns out that we have a new Visual Studio and a new .NET that have taken a lot of good lessons from the rest of the world of web development. .NET 5 is out of beta and into Release Candidate 1. That's good enough for me. I expect the bugs will be small.

Wait. It's Microsoft and everything they do is stupid.

Problem Zero (which I won't detail)

I actually went through fits trying to get it all installed and looking good, but, having done that, I create a new project: ASP Web Application/ASP.NET 5 Web Application.

Problem One:

I have to do this twice. I keep code on an external drive that the file dialog navigates to as //psf.stuff... . It tells me that "UNC paths are not supported." The second time, I typed (not navigated) the volume letter, "X:", and it worked.

Problem Two:

I build the solution. I get a bazillion (well, 204) errors. The first one tells me that "The type or namespace name 'Identity' does not exist in the namespace 'Microsoft.AspNet'". Another, "The type or namespace name 'AspNet' does not exist in the namespace 'Microsoft'". Can you imagine?

The project listed in the first error says "TestProject.DNX 4.5.1, TestProject.DNX Core 5.0" (obviously, the 'TestProject' is my project name). For the second one, it's "DNX 4.5.1" only.

I try using Nuget to add "Identity.Core" and it changes things. I screw around with that for awhile as new missing references appear until I start getting messages telling me that I have duplicate definitions. This is truly awful. (Did I mention that Microsoft always does it stupid? The package manager doesn't make sure the references are correct? Really?)

Problem Three:

I start over and this time I decide that I'm working toward .NET 5.0 so to hell with 4.5.1. I edit the project.json file and remove it. Build takes forever and I pretty much expect everything will blow up but instead, I get a message, "Dependencies in project.json were modified. Please run "dnu restore" to generate a new lock file."

This feels like progress. I right-click on the project, choose Open Command Line and type "dnu restore". It works. I return to VS and build again. It instantly repeats the exact same message. I delete the lock file and restore it. Same thing. A complete, stupid dead end.

THE SOLUTION (and a lesson in the complete depth of Microsoft stupidity)

I reverse the order of the references to 4.5.1 and 5.0 so that 5.0 comes first. IE,

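The screenshots are gone, but the change amounts to reordering the keys in project.json's frameworks section. A sketch, assuming the default RC1 template names:

    Before (build fails):
        "frameworks": {
            "dnx451": { },
            "dnxcore50": { }
        }

    After (build succeeds):
        "frameworks": {
            "dnxcore50": { },
            "dnx451": { }
        }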

The build succeeds promptly. Clicking the IIS Express button opens a web browser and shows me the scaffold web page.

This has taken me over 2.5 hours. Certainly, my inexperience with this technology made it slower. Someone better might have done it more quickly. However, this is the scaffold. This is the part that supposed to save time. This is an epic fail on Microsoft's part. I mean, they put the dependencies in the scaffold in the wrong order!!

The good news is that, if I got enough Google-friendly text in this page, you might have found it well before 2.5 hours elapsed.

Of course, that just means you need to endure Microsoft's next awful surprise. Good luck. I know I need some.

Bash List of Files in Directories with Complete Paths (2015-11-18)

I don't want to forget this and it took me too much googling to find.

find $PWD -type f | grep xsd

gets:

/..absolute path../Collections.xsd
/..absolute path../Composite/SIFNACompositeObjects.xsd
/..absolute path../Entity/SIFIdentityManagement.xsd
/..absolute path../Report/SIFNAassessmentSummary.xsd
/..absolute path../SIFglobal.xsd

Note that it is looking inside folders.


(Too much googling but, thanks StackOverflow: http://stackoverflow.com/questions/246215/how-can-i-list-files-with-their-absolute-path-in-linux)

NodeJS Express Body-Parser Post Data Missing Problem (2015-11-13)

I am amazed that NodeJS' main network server package, Express, does not handle Post data on its own. I just don't get it. It requires a package called Body-Parser.

I copied in the sample code from the Express website

http://expressjs.com/4x/api.html#req.body

got the required packages and made a little form to test it.

It did not work.

The docs explain that Body-Parser builds a request.body that contains the post data, but mine was empty. It existed, but it was empty.

I did one million things to make sure that I was doing what I thought I was doing. Postman? Check. Curl? Check. Inch by inch inspection of my entry page? Check.

I got to looking at the post on the way into the page in Firebug. I noticed that the encoding that Firefox was using was

application/x-www-form-urlencoded

The Body-Parser docs say that any of their decoders will take a type parameter to specify this. I found an example and tried it out:

{type:'application/x-www-form-urlencoded'}

Nope. I tried this in some decoder called raw(), the urlencoded() one, I even put it into json() just in case. Nada.

At my wits end, I'm just trying things in Postman. I tell it to encode it in various ways and, Voila!!, when I choose

multipart/form-data

It works.

WTF? I think. Everything specifically tells me that Body-Parser SPECIFICALLY DOES NOT DO MULTIPART form data.


How much clearer could it be?

Then I realize, the sample code from the Express docs caused me to install (last night when this nightmare began) something called Multer. Experimentation tells me that this is the reason I can do multipart/form-data. Without Multer, I get nothing.

But, without Multer, I get nothing no matter what I do. I did everything I have tried before and could not get Body-Parser to work. With Multer, only multipart/form-data. Without, nothing.

If anyone can enter a comment telling me how I got this wrong, I would be grateful.


UPDATE: It was none of the above!!!


It turns out that, in the course of the above screwing around, I moved the assignment of the router to follow the assignment of the Body-Parser. Express executes the router before the parser if you tell it to. I had inserted the new body-parser code after the route. No reason, it just happened.

app.use(bodyParser.urlencoded({ extended: true }))

must precede

app.use('/', router);
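In other words, parser middleware has to be registered before the router that reads req.body. A minimal sketch of the working order (route and port are illustrative):

    const express = require('express');
    const bodyParser = require('body-parser');

    const app = express();
    const router = express.Router();

    router.post('/submit', (req, res) => {
        res.json(req.body); // populated because the parser ran first
    });

    app.use(bodyParser.urlencoded({ extended: true })); // parser first
    app.use('/', router); // router second

    app.listen(3000);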


“This project is incompatible with the current version of Visual Studio” (2015-11-05)

Nice error message Microsoft. Why not just say, "Screw off. We don't care about your old projects." This is why I avoid them whenever possible.

However, in this case, it wasn't possible. It was some reference code for the protocol of a new thing I'm working on. It was done in Visual Studio 2013. I'm using Visual Studio 2015.

After doing a bunch of stuff that was totally useless (I am not very strong in .NET), I happened upon this simple solution...

Open the .csproj file in a text editor.

Change the ToolsVersion

12.0

to

14.0

Done.

Or, if you need more detail:

Change

<Project ToolsVersion="12.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

to

<Project ToolsVersion="14.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

After that it opened with no problem and I was able to build and use the project.

I have no idea what the ramifications of this could be in the long run. If I find any that are adverse, I will write it down here.

mysql: Load Data From and Select ... Into Outfile (2015-08-28)

Just so I don't forget...

I did a test of this.

select firstName, lastName from users into outfile 'tempTest9999'

Then, after creating an appropriate table...

load data infile 'tempTest9999' into table tempTest

The thing I don't want to forget is that the file was put into the directory...

/var/lib/mysql/DATABASENAME

The owner and group were both 'mysql'.
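If you'd rather choose the location yourself, an absolute path works too, subject to the mysql user's write permissions (and, on newer MySQL versions, the secure_file_priv setting). A sketch:

    select firstName, lastName from users
    into outfile '/var/lib/mysql-files/tempTest9999';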

I know, it's what one would expect, but I forget these things.
