I returned to pump.io in Oct 2016 and again in May 2017, when I rebuilt the page and deprecated its aged content. The page had remained unchanged since I had to stop my 2015 experiments. In 2015 I’d got the server software working but had not made a usable solution for either microblogging or messaging. I hope that the prerequisites remain node.js, npm, a database server and ‘gm’; I assume that the node.js package forever remains desirable.

2016

Here are some links

I decided to install on a brand new Ubuntu 16.04 image, and there have been some npm install problems. I have worked out how to run mongodb in a docker container.

On my first Ubuntu 16.04 build I was getting errors on the npm installs, but after several rounds of sudo apt-get update and sudo apt-get upgrade, the npm calls work, although with warnings in the case of forever. Now on to mongodb.

mongodb

This page describes how to install and start mongodb. Use apt-get to install; I amended the .conf file to invoke smallFiles, implemented the recommended service file mongod.service, and used the service command to start and stop as tests.

In May 2017, I moved the smallFiles option from the conf file to the command line in the service file.

It is recommended to run mongod as a non-root user; that is what the systemd service file mandates, and the post-install triggers set up the mongo users. The first time mongod runs it performs a database initialisation. If this is done as root, you will need to remove all the files from the database sub-directory before it will run as a non-root user, because the daemon cannot manipulate the files previously created as root. Doh! I need to check about smallFiles, as the alternative way to set it is from the command line.
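As a sketch of what that service file looks like with smallFiles moved to the command line (the unit location, user name and binary path below are my assumptions, not copied from my actual file):

```ini
# /etc/systemd/system/mongod.service (sketch)
[Unit]
Description=MongoDB database server
After=network.target

[Service]
# Run as the non-root user set up by the package post-install triggers
User=mongodb
Group=mongodb
# smallFiles passed on the command line rather than set in the conf file
ExecStart=/usr/bin/mongod --config /etc/mongodb.conf --smallfiles
Restart=on-failure

[Install]
WantedBy=multi-user.target
```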

mongodb file systems require support for fsync, so it will not work on vbox shared folders. There are also performance reasons for not using NFS, and there may be some problems in using data volumes in Docker. I used this page to guide me in building the volume structure/regime.

I built an image using the instructions here, pushed it to my repo, and started the container for the first time using the following command,

$ docker run -p 27017:27017 --name mongodb-1 -d dfl1955/mongodb --smallfiles
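The linked instructions are authoritative, but the shape of the image is roughly this (the base image, package name and paths below are my assumptions):

```dockerfile
# Sketch of a mongodb image; arguments given to `docker run`
# (such as --smallfiles above) are appended to the ENTRYPOINT.
FROM ubuntu:16.04

RUN apt-get update \
 && apt-get install -y mongodb-server \
 && rm -rf /var/lib/apt/lists/*

# Keep the database on a volume; remember mongodb needs fsync support,
# so don't back this with a vbox shared folder.
VOLUME /data/db
EXPOSE 27017

ENTRYPOINT ["mongod"]
```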

npm & pump.io

Install node.js and npm via apt-get install nodejs npm; on Ubuntu, symbolically link /usr/bin/nodejs to /usr/bin/node.

$ sudo ln -s /usr/bin/nodejs /usr/bin/node

Install GraphicsMagick using apt-get; I originally did this using npm.

I have used the manual install route, so mkdir /opt/local if needed, then

$ cd /opt/local/
$ git clone https://github.com/e14n/pump.io.git
$ cd pump.io 
$ npm install -g
$ npm test

and then

$ npm install -g databank-mongodb

Make the JSON config file, include the “serverUser” parameter, and then make the user. The file lives by default in /etc/pumpio.conf.json. If you use the default logging location, you need to make /var/log/pump, but first you need the user

$ useradd pump

then

sudo mkdir /var/log/pump
sudo chown pump:pump /var/log/pump

Because of vbox, I installed and started Samba, and mongod obviously.

I made the pump.io config file

{
    "driver":  "mongodb",
    "params":  {"hostname": "localhost"},
    "hostname": "davevbu16c",
    "secret":  "ruislip",
    "noweb":  false,
    "site":  "Awesome Sauce",
    "owner":  "Dave Levy",
    "ownerURL":  "http://davelevy.info/",
    "port":  80,
    "address":  "0.0.0.0",
    "nologger":  false,
    "logfile": "/var/log/pump/pumpio.log",
    "serverUser":  "pump",
    "debugClient": false,
    "firehose": "ofirehose.example"
}

Using the default port (31337) caused errors, so I changed the listener port to 80.

2015

I have decided to give pump.io a whirl, given the constraints of using others’ microblogging services. I last used this when it was laconica. The database server must have a databank client interface, which they say means mongodb, couchdb or redis. They recommend mongo as the best default; should be fun.


On Ubuntu, one needs to create a link to provide the alias of node for /usr/bin/nodejs. See more at Stack Overflow.

Installing forever: this is another node.js package and needs to be installed using npm. See also this article at exratione.com, and this article at Github. forever is needed to set some of the environment, and the use of the -a & -l flags is significant, or more accurately, the -l flag takes an argument, which is the first string after the flag; i.e. -l immediately followed by -a doesn’t work, because -a is consumed as the log file name. This has been bugged here…

https://github.com/foreverjs/forever

Important commands include forever start application and forever --help. NB Global installations using npm make links in /usr/local/bin.
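To make the -l pitfall concrete, a sketch (the log path and start script below are my assumptions):

```shell
$ forever start -l /var/log/pump/forever.log -a /opt/local/pump.io/bin/pump   # works
$ forever start -l -a /opt/local/pump.io/bin/pump   # fails: -a is read as the log file name
```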

My notes on packages and parameters have been moved to a comment, dated 2nd May 2017.

I have written a start/stop script. I wonder if anyone else will find it useful. (Probably not; it’s LSB compliant.)

To solve the problems related to pump.io itself, I need to turn logging on. The log will need pretty-printing, see this and this. It looks like python will be helpful.
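The log is JSON, one object per line, so python’s built-in json.tool can do the pretty-printing; a minimal sketch (the sample line below is made up, not a real pump.io log record):

```shell
# In practice: tail -1 /var/log/pump/pumpio.log | python3 -m json.tool
echo '{"name":"pumpio","msg":"listening","port":80}' | python3 -m json.tool
```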

This might be best encapsulated using Docker.


Some more links to help with mongo,

  1. http://docs.mongodb.org/manual/reference/configuration-options/#systemLog.component.storage.verbosity
  2. http://stackoverflow.com/questions/17708897/how-to-get-pump-io-oauth-consumer-key
  3. http://stackoverflow.com/questions/14181047/how-to-set-permernent-dbpath-for-mongodb
  4. http://stackoverflow.com/questions/10805782/how-to-run-mongo-db-as-service-using-non-default-dbpath
  5. http://info.mongodb.com/rs/mongodb/images/10gen-MongoDB_Operations_Best_Practices.pdf
  6. http://docs.mongodb.org/manual/reference/command/copydb/

Maybe I need a separate snip for mongo.


10 Replies

  1. In 2017, I decided to rebuild the pump.io page and removed the following comments from it, since they no longer represented good advice.
    I have fixed pump.io to run inside a virtual box VM, running Ubuntu 14. Now working in the office between Windows machines. Some of what I say here is thus not the best advice.

    I used apt-get to install nodejs and npm, and then used npm to install gm & forever, i.e. npm -g install gm forever.

    This describes how to install mongodb on ubuntu 14.04, as does this page from mongodb’s documentation. The mongo install is best done from their repo; for some reason they don’t use add-apt-repository. Also, the default configuration for mongodb is to use and create over 3Gb of journal files, which is not so good if using a VM or a tiny AWS VM. The configuration file /etc/mongodb.conf holds the run-time parameters; there are two of interest, nojournal and smallfiles. These are booleans, so

    smallfiles=true

    is required to be added to the config file; then the service needs to be stopped, the journals deleted, and the service restarted. On Ubuntu the service name is ‘mongod’. I am not yet brave enough to run with nojournal. Also, oddly to my mind, the default location for the database is /var/lib/mongodb, which it shares with the log. Oops. It’s not where I’d put it. (I wonder how to fix this, because the install creates the database etc. and downloads the *.conf file.)
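    The recipe above, as a sketch (paths are the Ubuntu defaults named in this comment):

```shell
$ echo "smallfiles=true" | sudo tee -a /etc/mongodb.conf
$ sudo service mongod stop
$ sudo rm /var/lib/mongodb/journal/*
$ sudo service mongod start
```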

    On the parameter’s in the config file, so originally set so http://127.0.0.1:31337 gets the site from the same machine, and this article, one of many talks about hiding the port number, I must dig out my snipsnap notes. Maybe not, using Apache as a proxy server is strongly deprecated. Users created while this IP address was in place were bound to the address i.e. the address was held in the database as an attribute of the user. This caused pump to hang when logging in or creating a new user. I should probably bug this on Git.

    I had a problem with access to the uploads directory, so I made a user to run pump.io under and chowned the directory to it. NB, you can’t post pictures or avatars without getting the permissions right.

    Where is the mongodb database held? It sustained the database and conf file through an uninstall/install cycle. The mongo client command is mongo; the language is not SQL, more JSON.
