Some notes about Digital Ocean servers

I’ve been unhappy with my current website host. My old hosting company, Verio, sold part of its web-hosting business (including my websites) to another company. Verio used to provide PHP by default; the new host does not. I decided to try out another cloud provider, Digital Ocean, to see what they could offer.

I spent a while reviewing their help pages to prepare myself for surprises. In fact, I spent enough time on their documentation site that they offered me a $10 credit when I set up a new account, which I eventually did. Click on the link above to get your own $10 credit. (I’ll get some credit, too.)

Setting up a droplet is just as described in their setup page. Digital Ocean also provides larger machine sizes ($320/month and above) for monster machines with dedicated CPUs and/or high RAM requirements, but I don’t need that. I want a small Ubuntu machine to park the domain names that I own but don’t use.

I like their suggestion to use public/private keys. I did not realize that each client machine should use exactly one key pair. I had set up separate key pairs for Bitbucket, GitHub and now Digital Ocean, and I was unable to log in automatically until I replaced the Digital Ocean public key with the default public key I created on my machines a long time ago. I still set up a password for the non-root account, and I still have to type it for ‘sudo’ stuff, which is annoying, but login works automatically and well.
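If you want to reuse an existing default key, the setup is short. A minimal sketch (the username and droplet address are placeholders; only generate a key if ~/.ssh/id_rsa does not already exist):

> ssh-keygen -t rsa -b 4096
> cat ~/.ssh/id_rsa.pub | ssh sammy@droplet-ip 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'
> ssh sammy@droplet-ip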

I decided to follow their instructions for setting up a server firewall. I’m not familiar with iptables, and Digital Ocean recommends using ufw. I followed their instructions and discovered that new terminal windows were no longer logging in automatically with ssh; the connection kept timing out for some reason. I rolled back my ufw changes, but I still had trouble with logins. I sent in a help ticket and received some additional instructions that look like they work. They have so far, so that’s good.
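My best guess is that the firewall came up before ssh was allowed through it. A minimal sketch of the order that avoids locking yourself out (assuming the OpenSSH application profile is registered with ufw; otherwise allow 22/tcp explicitly):

> sudo ufw allow OpenSSH
> sudo ufw enable
> sudo ufw status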

I continued setting up the server. Apache installed without trouble, even though they recommend Nginx. (I don’t know Nginx yet.) I skipped MySQL because I did not need it for a placeholder / testing site. I followed their instructions for installing PHP and … discovered that Ubuntu 16.04’s default repositories no longer carry PHP 5, only PHP 7. I had to add an additional repo for the PHP 5 packages. Not a big deal, once I knew what was going on. Finally, I have PHP 5 installed, and for a placeholder site, I like it.
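For reference, the extra repo most guides point to is Ondřej Surý’s PPA; something like this should do it (treat the package names as a sketch and double-check them against the PPA before copying):

> sudo add-apt-repository ppa:ondrej/php
> sudo apt-get update
> sudo apt-get install php5.6 libapache2-mod-php5.6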

Python errors: I should have known that

I needed to write some data to a MySQL database. I set up the MySQL Python connector without any real trouble, tested it in a tiny Python script, and confirmed that it connects to the correct database. Great!
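A smoke test along these lines is enough to confirm the connection (the credentials and database name here are made up):

import mysql.connector

# Hypothetical credentials for illustration only.
conn = mysql.connector.connect(user='webuser', password='secret',
                               host='127.0.0.1', database='scores')
cur = conn.cursor()
cur.execute('SELECT VERSION()')
print(cur.fetchone())
conn.close()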

I moved the connection code to a function on a new page and ran into trouble. I kept seeing a message along the lines of “ReferenceError: weakly-referenced object no longer exists”. After a detour into weak references, I realized the issue was garbage collection: somewhere within my function, an object was disappearing.

The connection function took no parameters; it created a connection object local to the function and returned the resulting cursor. You should see the problem immediately. The connection object was the item that was disappearing. I rewrote the function to return the connection object and then extract the cursor from the returned object. That error message went away.
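Here is a minimal sketch of the before and after, with names of my own invention (the connector’s cursor apparently holds only a weak reference to its connection, so the connection has to outlive the function):

import mysql.connector

def get_cursor():
    # Broken: conn is local, so it is garbage-collected when the function
    # returns, and the cursor's weak reference to it goes stale.
    conn = mysql.connector.connect(user='webuser', password='secret',
                                   host='127.0.0.1', database='scores')
    return conn.cursor()

def get_connection():
    # Fixed: return the connection and let the caller take the cursor,
    # so the connection stays alive as long as it is needed.
    return mysql.connector.connect(user='webuser', password='secret',
                                   host='127.0.0.1', database='scores')

conn = get_connection()
cursor = conn.cursor()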

I had another strange problem where updating a field was not allowed because of a type mismatch. I’m used to PHP, where weak or loose typing is the norm. Once I realized that data going into a MySQL table also needs to match the type expected by the MySQL column, the problem was solved.
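In practice that means casting each value to the column’s expected type before handing it to the cursor; continuing with the connection above (the table and column names are hypothetical):

score = '42'   # arrives from the web layer as a string
cursor.execute('UPDATE games SET score = %s WHERE id = %s', (int(score), 7))
conn.commit()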

So I decided to install Java …

I’m not a Java coder. I work primarily in PHP, with Python on the side. I’ve been rewriting an old PHP project to conform to modern standards, including testing. I decided to use Codeception for PHP testing, mainly because it looked like PHPUnit was included and unit/integration/acceptance testing looked easy (if you follow the examples on the website). I had used Gherkin to write BDD tests before, so I was happy to see it supported in PHP. I also saw auto-testing with PhpBrowser and … Selenium!

I had used Selenium for testing several years ago. I remembered that all I had to do was fire up the Selenium server and run the tests to catch any JavaScript browser issues. I remember that being very helpful, so I downloaded the Selenium server and tried to fire it up. … Wait a minute. I don’t have Java on this machine any more? The OS upgrades probably turned it to toast. OK, I’ll install that first.

I spent a few days thinking about how to install it. Since I use brew to install other command-line applications, I wondered how brew would handle a JDK installation. My Google search for “jdk homebrew” turned up several web pages whose instructions did not fully work, so I had to piece together instructions from several of them.

For starters, I ran into a command that mentioned casks. Eventually, after another Google search (“homebrew install cask”), I discovered that a “cask” is Homebrew’s way of managing graphical applications. Anyway, after some review, I finally have Java (1.8.0_131) running on my El Cap box. Yay, me! You’re welcome to try these commands or to review the pages linked above and figure it out yourself. It looks like it will work either way.

> brew update
> brew install caskroom/cask/brew-cask  …  (probably did nothing)
> brew tap caskroom/cask
> brew cask install java

Note that I chose not to install “jenv”, which creates virtual environments similar to Python’s “virtualenv” or “venv” (?). Somewhere in one of those pages there was a note that I needed to install Java 7 first. I never found a reason why it was needed, so I skipped it. We’ll see if I end up needing it.

I should have known that.

Recently, I upgraded my laptop to macOS Sierra. Along with the OS update, I used brew to update python3. I was not paying attention and ended up upgrading PostgreSQL to the newest version, 9.6.2, as well. I was unhappy with myself for not paying attention, but I was glad that PostgreSQL and MongoDB got upgraded along with Python 3.

I use brew to start and stop PostgreSQL as needed, so I started it to check how it ran after the update. ‘psql’ gave me a version number, but I could not connect to look at the schemas or tables from previous work. I also saw the following error message:

psql: could not connect to server: No such file or directory
	Is the server running locally and accepting
	connections on Unix domain socket "/tmp/.s.PGSQL.5432"?

It was a very strange error message, so I started doing Google searches to see what could be wrong. There were some dead ends involving checking for a PID file and checking permissions in /usr/local/var/postgres. Eventually, I found what looks like the answer in two related places: a blog post describing upgrades to PostgreSQL and another describing the upgrade process using brew. Apparently, I had two versions of PostgreSQL on my machine, and the new binaries could not start against the old data directory.

> brew info postgresql

confirmed it. Both pages described what I have to do, so I’ll do it. I should have known that a major-version upgrade needs this step. Now I know.
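For the record, the fix both pages describe boils down to initializing a new data directory and running pg_upgrade between the old and new binaries; something like this with the usual Homebrew paths (the old version number comes from the brew info output, so treat the paths as placeholders):

> initdb /usr/local/var/postgres9.6 -E utf8
> pg_upgrade -b /usr/local/Cellar/postgresql/OLD_VERSION/bin -B /usr/local/Cellar/postgresql/9.6.2/bin -d /usr/local/var/postgres -D /usr/local/var/postgres9.6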

Windows 7 updates, resolved

I’ve had some trouble with Windows 7 before. (tl;dr: I installed W7 on a VirtualBox VM. After several different mistakes, it finally installed, but no updates would download.) After some review and investigation, it looks like Windows Update is finally working. I learned quite a bit.

For starters, it turns out that I had Windows 7 Service Pack 1 installed right from the DVD. That saved me some trouble, but I felt foolish for not knowing it.

I kept trying Google searches to figure out what was going on with the Windows Update problems. Eventually, after several months, I stumbled across a pinned Reddit post that explained what to do. It made sense, and other pages in my search results independently confirmed what I needed to do: download a series of update rollup files and install them piece by piece. If I did it correctly, it would work.

I quickly ran into trouble with an ActiveX control. For some reason, I was asked to install an add-on named “Microsoft Update Catalog”, even though I already had it. It was enabled but not appearing in the list of running add-ons in IE 11. Fine, I switched to Chrome and continued downloading what I needed.

Downloading the .msu files took time. I even set up a snapshot at one point to make sure I could roll back, because one update’s notes described several errors whose fixes required registry changes. I worked my way through the April 2015 and the March / July / August / September / October / November 2016 updates, then let Windows check by itself. It worked! 29 additional updates were downloaded and installed, and I did not have to hunt for them. Finally!

Next steps:

  • Clone the Windows VM
  • Authenticate the Windows installation in the cloned VM (I’ve had to do this once already on a VM that I eventually kicked out of the airlock.)
  • Install SQL Server 2014, available through Visual Studio Dev Essentials
    • Can I say how impressed I am with the Microsoft developer tools? I’ve always been an Apple guy, but I have not been excited about Xcode and Swift 3. The Microsoft stuff is new to me, so it looks exciting.
  • Study and practice for the MCSA exams in SQL Server 2012/2014

Interesting GitHub question

I have most of my private repositories hosted on BitBucket.org, which provides free space for private repos; GitHub requires a paid plan for them, and I have lots of private repos. However, it seems that everyone wants to see coding samples, and GitHub does not require a paid plan to host public (open source) repositories, so why not take a fresh look?

I don’t want to show off all my mistakes in the public GitHub account, so I’m keeping the private BitBucket account. How do I push additional BitBucket snapshots that are sort-of cleaned up to the public GitHub account?

Step 1: Set up a public GitHub repository
That’s easy. I already have a GitHub account that I used when I was taking the Berkeley Coursera class. The class repo was private while I was taking the class, but became public when the class ended. Setting up a new repo in GitHub is easy, and they provide help pages if you have questions.

Step 2: Figure out how to push the local repos
I already use GitHub Desktop (for Mac) to clone tutorials. In the end, I decided not to use it to clone and push the BitBucket repos to GitHub, because I was concerned the remote alias pointing to BitBucket would be overwritten.

Step 2a: Use the command line interface to create another alias
Cloning a repository from GitHub (or BitBucket) automatically creates a remote alias that is used to push snapshots back up to that repository. I’m already using that alias (origin) for BitBucket, and I was concerned that GitHub Desktop would overwrite it if I pointed the app at GitHub. However, I can use the CLI to create a second alias.

> git remote add github git@github.com:rachavez/tournament-score-keeper.git

When I check the aliases using “git remote -v”, I see another set of aliases pointing to a different server.
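The output looks something like this (the BitBucket URL is illustrative, inferred from the repo name above):

> git remote -v
github	git@github.com:rachavez/tournament-score-keeper.git (fetch)
github	git@github.com:rachavez/tournament-score-keeper.git (push)
origin	git@bitbucket.org:rachavez/tournament-score-keeper.git (fetch)
origin	git@bitbucket.org:rachavez/tournament-score-keeper.git (push)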

So, while I’m stuck in dead ends and intermediate steps, commits go to BitBucket. Once I figure everything out and arrive at a good stopping point, the commits all get pushed as a block to GitHub.

> git push github master

I can live with it. Even better, I noticed in SourceTree (Atlassian/BitBucket’s Mac desktop app) that the new alias appears in the remotes list. Potentially, I could update SourceTree settings to go back and forth between both repositories. Interesting.

Adventures with React and webpack

I decided it would be useful to relearn front-end web design and development. The big thing now is JavaScript and the various frameworks associated with it. I did not pick Angular, partly because of the schism between Angular 1.0 and Angular 2.0. “Schism” is a little harsh, but I don’t like how you have to pick a side to work with Angular and the people on the other side won’t help. It reminds me of the Python 2/3 split, though that one is slowly healing.

Anyway, instead of Angular, I decided that Ember was the way to go. However, when I looked around, I did not see many Ember projects in progress. That’s my usual luck: given a choice in tech, I’ll pick the unpopular one.

I also started volunteering in an open source “collective”(?) for the presidential elections. I told them I had LAMP experience. They are interested in mobile experience. The first project I joined wanted React people. I did not know React, but I saw that it was popular, so I decided to learn it.

So, I’m learning React and React Router, which seems necessary for single-page applications. I decided to implement what I’ve learned in a little project for testing purposes. I have a group of friends who play board games. They like to get together over long weekends and play lots of games. Based on the game results, someone is declared that weekend’s winner and gets … a tournament cup. My thought was to build a React application that could track the tournament scoring.

So, I read a couple of React tutorials and set up a new site following their model. My code almost matched theirs, but I couldn’t get it to load. I saw that webpack could collect and load the files, so I included that and (finally) got webpack working, after fixing many typos. Next, I wanted to get a sample page working with Bootstrap 3.

That turned out to be even more difficult. Fortunately, the collective had already figured out what to do, so I followed their config file and eventually got my links to Bootstrap working. Lots of additional stuff is required to get Bootstrap to work properly (the install command after this list pulls in the whole pile):

  • babel: expected, since I’m using JavaScript and webpack
  • style-loader and css-loader: What? Umm, OK. It works for the custom CSS file, but won’t load the Bootstrap file properly
  • file-loader for the Bootstrap glyphicons: Umm, OK.
  • url-loader for the woff2 and woff files: What are those?
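For the record, all of these arrive through npm as dev dependencies; roughly this, though the exact package set depends on the collective’s config, so treat it as a sketch:

> npm install --save-dev babel-core babel-loader babel-preset-react style-loader css-loader file-loader url-loader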

It all works again, but wow, what a mess.