
Review: Weyes Blood- And in the Darkness, Hearts Aglow (Sub Pop, 2022)

Natalie Mering’s fifth LP grapples with the separation and alienation we’ve all been feeling thanks to the yawning distance- both physical and cultural- that has thrown modern society for such a loop these past few years. While Mering muses about how to cope with unprecedented heart-heaviness, her music takes that darkness and floods it with light and transcendence.

These piano-driven songs, completely modern and hearkening to less mechanized times all at once, balm the wounds with a warm bath of strings, background singers, bells and harp (courtesy of Mary Lattimore) and, of course, Mering’s dulcet, clear voice leading us through. The chord progressions of “Children of the Empire” recall the best of Brian Wilson’s late-60s work with Van Dyke Parks, but temper the elation with the cosmic heartbreak of minor/diminished/augmented progressions inspired by George Harrison. These influences find a new kind of synthesis that only Mering could accomplish, one that yields feeling in equal measure with her impassioned lyrics. “God Turn Me Into a Flower” takes some inspiration from old-time gospel, updated for our times, and its words evoke the Taoist idea of bending like a reed.

As long as I stand to face the crowd
To know my name, to know its sound
It’s good to be soft when they push you down
Oh, God, turn me into a flower
It always takes me, it’s such a curse to be so hard
You shatter easily and can’t pick up all those shards

Simple organ chords and Mering’s passionate voice start the song perfectly, and it builds to a nature-meets-nurture crescendo that’s the most moving thing I’ve heard since Spiritualized’s Songs in A&E.

Kudos also to Sub Pop for the quiet, distortion-free vinyl. Absolutely stunning. Without a doubt, this is my favorite album of 2022.

Winter project: daap radio station

In late October of last year, I had a tangle with gift wrap and a set of stairs, and I lost, breaking and dislocating my right ankle in the process. Ever since, I’ve been stuck at home for the most part. First there was the trip to the emergency room, where my broken, dislocated ankle was put back in proper position and placed in a splint and cast. Next there was the surgery- added to my natural arsenal were a plate and some screws. Re-splint, re-cast. Finally, after enough time, I was told to wear a CAM boot, which incidentally weighs at least twice as much as a cast. So I’ve had a lot of time to consider my situation. (Actually, not a lot of time- I made arrangements to work from home once I could sit in a chair and access a laptop.)

One thing I had wanted to experiment with was finding a way to improve my “radio station”- really just an FM transmitter connected to whatever was handy. I have a bit of a reputation in the neighborhood for providing a Christmas radio station. Being near the crest of a bluff, the reach of this little box is remarkably good, spanning the immediate area, the neighborhoods downhill from us and even the ridge across from us. So I feel a bit of pride, and an obligation to maintain and improve.

Two Christmases ago I moved away from Windows- the setup was basically an m3u playlist dragged and dropped onto Winamp, then shuffled and looped from December through January. During that time, Winamp needed at least three restarts, and Windows 8 would cough up a hairball and die about once every two weeks. I had grown tired of tuning into the station only to hear dead air, and realized it was time to up my tech game. I had this Windows laptop lying around when I got the wild hair to play radio Santa, and I had already spent enough time compiling music for the project, let alone tweaking the setup. But I’d had enough.

At first, the alternative was simple. I wanted to use the same machine of limited capabilities, so it had to be very low-resource. I installed Ubuntu and booted it straight into the console. I used a simple console player- mpg123- and invoked one command on startup to have it play all of the files it found in a certain subdirectory. And it worked pretty well. No dead air. No restarts. But there was zero flexibility: playlist management consisted of adding and deleting files. And while I do have a few radios and even an FM tuner in my stereo, I really wanted to access the station in more modern ways.
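
One way to do that (a sketch, assuming the music lives under /srv/Music, the same path used later in this post- mpg123’s --list option reads an m3u file and -Z plays it in random order indefinitely):

# build a playlist of everything under the music folder, then shuffle-play it forever
find /srv/Music -iname '*.mp3' > /srv/Music/all.m3u
mpg123 -q -Z --list /srv/Music/all.m3u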

Enter DAAP- the Digital Audio Access Protocol. It’s most commonly known as the protocol iTunes uses to share libraries. Fortunately, it also has wide platform adoption beyond Apple. What really sold it was finding a Linux daemon- forked-daapd- that also supports RSP clients. One example of those is my ten-year-old Roku SoundBridge, which has been gathering dust in a cabinet for a few years now. Another is an app in the Play Store. Yet others are a couple of Linux programs. So I could set up a server to make the playlist available to several devices; one of the less mobile ones can be the dedicated radio station device that outputs right to the FM transmitter. So many bases covered!

forked-daapd supports these kinds of clients:

DAAP clients, like iTunes or Rhythmbox
Remote clients, like Apple Remote or compatibles for Android/Windows Phone
AirPlay devices, like AirPort Express, Shairport and various AirPlay speakers
Chromecast devices
MPD clients, like mpc (see mpd-clients)
MP3 network stream clients, like VLC and almost any other music player
RSP clients, like Roku Soundbridge

Here is a list of working and non-working DAAP and Remote clients. The list is probably obsolete when you read it 🙂

Client                   | Developer  | Type   | Platform      | Working (vers.)
-------------------------|------------|--------|---------------|----------------
iTunes                   | Apple      | DAAP   | Win, OSX      | Yes (12.1)
Rhythmbox                | Gnome      | DAAP   | Linux         | Yes
WinAmp DAAPClient        | WardFamily | DAAP   | WinAmp        | Yes
Banshee                  |            | DAAP   | Linux/Win/OSX | No (2.6.2)
jtunes4                  |            | DAAP   | Java          | No
Firefly Client           |            | (DAAP) | Java          | No
Remote                   | Apple      | Remote | iOS           | Yes (4.2.1)
Retune                   | SquallyDoc | Remote | Android       | Yes (3.5.23)
TunesRemote+             | Melloware  | Remote | Android       | Yes (2.5.3)
Remote for iTunes        | Hyperfine  | Remote | Android       | Yes
Remote for Windows Phone | Komodex    | Remote | Windows Phone | Yes (2.2.1.0)
TunesRemote SE           |            | Remote | Java          | Yes (r108)

(https://github.com/feihugao/forked-daapd)

I even had the closest thing someone like ME could imagine having for a host computer: a fresh install of Ubuntu Server 17.04 on a headless IBM desktop. I had already set up SSH and was administering it remotely, and it was ready to be used for something fun. The only problem was that it was in the basement on my bench, and I was upstairs in a cast. So however I did this, it had to be done completely remotely. A challenge!

The first challenge was how to copy files. I could have set up an rsync command to clone the collection over to the remote machine. But I wanted to pick and choose as I went, and to have a facility for quickly moving things around on the remote machine if needed. So I installed an FTP server. All I had to do was install it:

$ sudo apt-get install vsftpd
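
Depending on the package defaults, you may also need to allow local users to log in and upload before the FTP route is useful- a sketch of the two settings I mean (both are standard vsftpd.conf options; check the file on your system):

# make sure these two lines are present and uncommented in /etc/vsftpd.conf:
#   local_enable=YES
#   write_enable=YES
sudo nano /etc/vsftpd.conf
sudo systemctl restart vsftpd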

Installing forked-daapd

https://github.com/ejurgensen/forked-daapd/blob/master/INSTALL

If you are the lucky kind, this should get you all the required tools and
libraries:

sudo apt-get install \
build-essential git autotools-dev autoconf libtool gettext gawk gperf \
antlr3 libantlr3c-dev libconfuse-dev libunistring-dev libsqlite3-dev \
libavcodec-dev libavformat-dev libavfilter-dev libswscale-dev libavutil-dev \
libasound2-dev libmxml-dev libgcrypt11-dev libavahi-client-dev zlib1g-dev \
libevent-dev libplist-dev libsodium-dev libjson-c-dev libwebsockets-dev

Optional packages:

Feature | Configure argument | Packages
--------------------|------------------------|---------------------------------------------
Chromecast | --enable-chromecast | libgnutls-dev libprotobuf-c-dev
LastFM | --enable-lastfm | libcurl4-gnutls-dev OR libcurl4-openssl-dev
iTunes XML | --disable-itunes | libplist-dev
Device verification | --disable-verification | libplist-dev libsodium-dev
Live web UI | --with-libwebsockets | libwebsockets-dev
Pulseaudio | --with-pulseaudio | libpulse-dev

After installation, edit the configuration file, /etc/forked-daapd.conf.

Note that ‘sudo make install’ will not install any system files to start the
service after boot, and it will not setup a system user.

forked-daapd will drop privileges to any user you’ll specify in the
configuration file if it’s started as root.

This user must have read permission on your library (you can create a group for this and make the user a member of the group, for instance) and read/write permissions on the database location ($localstatedir/cache/forked-daapd by default).

If your system uses systemd then you might be able to use the service file
included, see forked-daapd.service.

Otherwise you might need an init script to start forked-daapd at boot. A simple init script will do, forked-daapd daemonizes all by itself and creates a pidfile under /var/run. Different distributions have different standards for
init scripts and some do not use init scripts anymore; check the documentation for your distribution.

For dependency-based boot systems, here are the forked-daapd dependencies:
- local filesystems
- network filesystems, if needed in your setup (library on NFS, …)
- networking
- NTP
- Avahi daemon

The LSB header below sums it up:

### BEGIN INIT INFO
# Provides: forked-daapd
# Required-Start: $local_fs $remote_fs $network $time
# Required-Stop: $local_fs $remote_fs $network $time
# Should-Start: avahi
# Should-Stop: avahi
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: DAAP/DACP (iTunes) server, support for AirPlay and Spotify
# Description: forked-daapd is an iTunes-compatible media server for
# sharing your media library over the local network with DAAP
# clients like iTunes. Like iTunes, it can be controlled by
# Apple Remote (and compatibles) and stream music directly to
# AirPlay devices. It also supports streaming to RSP clients
# (Roku devices) and streaming from Spotify.
### END INIT INFO
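
For reference, the configuration mostly boils down to pointing the daemon at an unprivileged user and the library path. A minimal sketch of /etc/forked-daapd.conf (key names as in the example config that ships with forked-daapd; the share name is made up and the path is the one used later in this post):

general {
    # the user forked-daapd drops privileges to (see the permissions note above)
    uid = "daapd"
}

library {
    # the name the server announces to DAAP/Remote clients
    name = "Radio Station"
    # where the music lives
    directories = { "/srv/Music" }
}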

A playback note: after starting play, press the “*” button and options will come up. This works for photos and music.

#!/bin/sh
# save new source playlist for the station

# cron starts this from the daapd user's home directory, so work inside the library
cd /srv/Music || exit 1

# grab today's date
now=$(date +"%m_%d_%Y")

# check for new items by building a fresh playlist named for the date
find /srv/Music/ -iname "*.mp3" > "playlist$now.m3u"

# shuffle it into the playlist the station actually plays
shuf "playlist$now.m3u" -o stationlist.m3u

# reset perms so everything can read it
chmod 777 stationlist.m3u
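
Since the cron entry below calls the script directly by its path, the script also needs the execute bit:

# newplaylist.sh is invoked straight from cron, so make it executable
chmod +x /srv/Music/newplaylist.sh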

Add a cron job for the daapd user:

30 00 * * * /srv/Music/newplaylist.sh
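
One way to install that entry is to edit the daapd user's crontab directly:

# opens the daapd user's crontab in an editor- paste in the line above
sudo crontab -u daapd -e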

Make sure the daapd user has access to the cache folder:

chown -R daapd:nogroup /var/cache/forked-daapd/

For this distro, I can just restart cron and it will pick up the new schedule:

# service cron restart

That’s it! Now just find your server on your daap player and let it play!

This was a fun project. I’ve since moved on to another project with more radio-station-type automation, but the daap server saw me through that Christmas season with no hiccups at all.

Fun with Python- Pivot table to csv

I recently had the need to enumerate all of the file space being used on a server, broken down by folder and, within each folder, by the space consumed by popular file formats- mainly image formats and PDFs. The server OS is Windows Server, and I had to use Windows 10 as the platform to do it from.

I have played a bit with PowerShell and appreciate how sophisticated programming techniques are available right there in the command line. I googled a bit and found some bits and pieces to guide me to my goal, which was basically to iterate over all the folders and spit out filenames and file sizes whenever the filename matched one of the target patterns (.jpg, .gif, etc.). This is what I came up with:

get-childitem -path w:\wwwroot -Include *.jpg,*.png,*.pdf,*.bmp,*.gif -Recurse |
  where { !$_.PSIsContainer } |
  select-object FullName, LastWriteTime, Length, Extension |
  export-csv -notypeinformation -path c:\local\allfiles.csv
(get-content c:\local\allfiles.csv) | % { $_.Replace('"','') } | set-content c:\local\allfiles.csv

In a nutshell, this does the following:

  • get-childitem- walks a folder and its contents
  • path- starts the search at a certain path
  • Include- limits the results to that list of filetypes
  • Recurse- looks through all subfolders
  • !PSIsContainer- filters out folders, so only files come through
  • select-object- selects which properties of each file to keep. Here I’ve included FullName (something like “w:\wwwroot\folder\yadda.jpg”), LastWriteTime (a date/time stamp), Length (the size of the file in bytes) and Extension (.jpg, .pdf, etc.). That last one pretty much makes this whole thing possible, as now I can sort and report totals by filetype.
  • export-csv- yep! I am exporting the results to a csv (with a quick pass afterward to strip the quote marks) so I can monkey with it in Excel.

After running the command- and splitting the file path into site and folder columns so there is something to group on- it yields a file which looks like this:

site1,documents,9352,.pdf,W:\wwwroot\site1\documents\GreatFile.pdf
site1,documents,44567,.pdf,W:\wwwroot\site1\documents\BiggerFile.pdf

This can be put right into Excel and sorted, culled, summarized, etc. However… this document has 221,000 rows. It would take a long, long time to wrangle that by hand. Sure, some of it Excel can do out of the box, and for other things there are macros… I might have been able to piece together various tools and code and get it done.

That’s when I thought of Python. I had already used Python to transmogrify text to do my bidding. Surely there was a way I could auto-magically achieve what I needed in a couple of steps or so. And indeed after some digging, I found some fun things using pandas (as pandas.pydata.org describes it, “pandas is an open source, BSD-licensed library providing high-performance, easy-to-use data structures and data analysis tools for the Python programming language”). I already have Python 3 and pandas on my machine to do some things with analytics.

From my days as a trainer, what seemed to be needed here was a pivot table, and sure enough, pandas DataFrames have a pivot_table method that will do the job. This is the script I settled on eventually- it took all of four lines.

import pandas as pd
df = pd.read_csv(r'C:\local\allfiles.csv')
print(df.pivot_table(index='Site', columns='Extension', aggfunc=sum))
df.pivot_table(index='Site', columns='Extension', aggfunc=sum).to_csv('newoutput.csv')

Important parts:

  • df means DataFrame. In the second line (after including pandas in our recipe), a dataframe is created by reading the generated csv file.
  • I started with the print line just to get visual confirmation that it was working, and kept it to serve the same purpose in the final version.
  • Then the pivot table is built: the “Site” column is the index, the contents of the “Extension” column become the columns, and the file sizes are summed up for the output table- the filename of which is then specified.

Pandas is a powerful library. But what I find interesting is how it distills a bit of a brainteaser of a concept into a few clear directives, yielding some insight into how a pivot table is structured along the way. I guess that’s obvious, but I still find it- informative? Fun? Something.

But we’re not quite done here. It does output a lovely product:

[Screenshot: the pivot table as printed to the screen]

What the heck is “4.094785e+08?” That is scientific notation, which both pandas and Excel fall back to for exceptionally large numbers when no specific format is selected. What’s happening is a reduction of significant digits for the sake of brevity- a standard way of depicting big numbers when the layout, in this case the rows and columns of a pivot table, matters.

NaN means Not a Number- in this case it just means there were no files of that type in that folder, so there was nothing to sum.

The scientific notation bit threw me for a loop at first. I googled my fingers off looking for a way to convert the numbers to a specific format before exporting. There are ways to do it, but then I had a lightbulb moment: unlike all the sorting and summing, changing the number format for a range of values is no problem at all in Excel. So I went ahead and imported it, changed the format from General to Number, and things started looking a lot more normal.

But it still didn’t strike me as being as “at-a-glance” as my needs dictated. In a Stack Overflow discussion I found several helpful tips about converting number formats to common file-size units- kilobytes, megabytes, gigabytes, etc. I decided MB was good enough for my purposes, though there are some elaborate solutions that take into account everything from bytes to petabytes- they’re worth a look. This example was what I needed:

[<1000]##0.00"  B";[<1000000]##0.00," KB";##0.00,," MB"

This custom format displays each value as bytes, KB or MB depending on its size (409478500, for instance, shows up as 409.48 MB).

After removing some extraneous bits (the filename and the subfolders below the main site folders weren’t of any use in this report), consolidating the “JPG” figures with the “Jpg” and “jpg” figures, click-dragging a few autosums, and setting the format of the numbers, I produced a pretty nice report of the file space in several server folders taken up by images and PDFs.

Not bad!

Don’t put all the work on the artist

Famously (in my own mind) an art teacher once said: “Art is a dialogue. Without a reaction from a viewer, a painting does not communicate, and is not art.” Perhaps the point was taken to the logical extreme, but it hammers an important lesson home: the burden of the aesthetic experience isn’t all on the artist.

I’ve been thinking about this a lot since I purchased two used books recently, both Rolling Stone Record guides. One was published in 1980, and the other in 2000.

The 1980 volume was a bible of sorts for me when I was a teenager. I didn’t get an after-school job for the usual reason- putting gas in the tank of a crappy used car so you can go do teenage stuff- although the money certainly got used for that too. No, I was moved to find work for the express purpose of buying a stereo and building a record collection. Today you’d just go to Blogspot blogs or YouTube to build your free music library. Not in 1980. At that point in history, if you only cared about cruising, you bought cassettes. If you cruised AND had a classic car, there was a good chance you bought eight-tracks. But if you had arguments with your parents, got grounded and had to shut the world out with some critical music listening on a big, heavy pair of headphones, vinyl was the only way to go. But the start-up cost, especially to a 15-year-old, was not insignificant. It was a deal-breaker without that after-school job.

So I resigned myself to food service so I could buy my own stereo and records- and not have to listen to the archaic junk in the living room. I wanted real high fidelity! I wanted to FEEL those guitars and drums and shrieks of life down to the very core of my soul. Thank goodness for layaway. Six months later, I bought my first amp. Six months after that, a decent turntable and a set of headphones. And every paycheck went into the painfully slow procurement of one- sometimes two- $9 records at a time.

If you only have one new work of art, you study it. Scrutinize it. Drop everything and give it your full, undivided attention. And if you are a teenager with an after-school job in the food service industry, you have to choose VERY carefully, because if you bought a shitty album, it HURT. I mean in a real, visceral, physical way. It was painful to hear your hard-earned $9 wasted on bad music. And in those days, it was hard to trade used stuff. It involved the postal system and a lot of trust, and it sometimes didn’t turn out. People wonder why Columbia House was a viable business. This paragraph explains exactly why.

Anyway, they had a copy of the Rolling Stone Record Guide at the public library, on reference- meaning it could not be checked out. So before I had a car, I took a bus downtown after school every day and read it like seminary students read the Good Book- one chapter and verse at a time. I absorbed it so fully that even today, when I think of albums like ZZ Top’s Tres Hombres or the James Gang’s Thirds, I can see their ratings in my mind’s eye (both got four out of five stars). It guided my purchases back then, in the dark ages when you couldn’t hear the music before buying it.

So this book- the 1980 edition- is kind of the ur-document of my musical tastes. I went to college a few years later, my tastes forever changed by the intriguing artwork of the albums in the import bin- generally beyond the scope of Rolling Stone’s editorial tolerances- and the book and I parted ways. Many years later I bought a used copy on a whim- a couple of dollars for a good copy, why not?- and while I was at it, bought the edition from 20 years later just for fun.

The 1980 editors knew the historic background of pop music and could speak eloquently about where bands fit into that mosaic. They took the long view. If they didn’t outright reject the music, or praise it to high heaven, they could at the very least understand it. This meant that even if a review was mediocre, you could glean enough information from it to realize that YOU would probably love the record- and most of the time, they were right.

In stark contrast, the 2000 editors did not have much respect for the long view. They either ignored it or, worse, substituted their own spontaneous assessment of it. Rather than trying to find some kind of real value in the music (an impossible task, but necessary to attempt, especially if you are going to rank things), they seemed to favor the sort of blanket dismissal that is dispensed all too commonly. I think they were trying to be irreverent and snarky, but the result just reads like narcissism. The criticism is not something you can depend on- quite the stark contrast between editions. But it’s prescient nonetheless- like a gaze into the crystal ball. It’s not too hard to find this kind of self-absorbed dismissal on the Interwebs.

Hardly any thought is given to perspective. I can understand that to a degree- after all, what is music but sensory impulses to enjoy or not enjoy in the moment it’s played? But here’s the thing: without some patience and understanding, slow-growers never get a chance. Difficult work just remains difficult. It’s the aural equivalent of never eating your Brussels sprouts and going for the cupcake every time. Oh yes, the cupcake loves to please, and you can eat cupcakes all day long if you want. But it leaves you hungry. The more difficult Brussels sprouts give you longevity and a life of substance, not fluff, and in the end you’re more deeply satisfied.

Art is a two-way street. You can’t be content with mere ingratiation, unless you really, really don’t care about meaning and quality. But why would you be content with that? Would you be fine with friends that flatter you so you pay them attention, while secretly they loathe you? No, surely you would prefer real friends that weather the hard parts and appreciate you for who you are, not what they can get from you. Try the same with the art you enjoy. Look deeper. Find patterns. Look at the larger picture. Meet the artist halfway and try to understand why they don’t want to just spoon-feed you pleasure. It will be worth it.

RSync: Backing up your media without tears

I’m sure I’m not unique in using a couple of older laptops (a MacBook and a PC) for my various professional needs. Both run Adobe’s Creative Cloud apps with no problems. One is home to a modern Microsoft Office install for a couple of legacy jobs I need to perform occasionally, and has a SQL Server/.NET development environment; the other is my Apache/PHP/MySQL environment. I use laptops because I grab work time when and where I can, frequently on the road with people around me, and I create an invisible work perimeter in the form of a music library and headphones.

Yeah, I have given in to the temptations of iCloud, and I do manage my purchases that way- it’s nice to have those available no matter what. But that’s only a small piece of my vast media empire. These machines have older drives, too; I’m waiting for the SSD price/capacity ratio to hit that magic decimal place. So to give my media some redundancy, I bring along an external drive. You know how it goes- now I have media in three different places: in the cloud and on two different portable drives.

Since I work in the development world, I tend to forget that I am sometimes The User and need some professional guidance. I had this fab idea that I needed a NAS in my home network mix, so it could do most of the heavy lifting and my drives could retire to being primary and secondary backup sources. I impulsively grabbed a Seagate Central NAS while idly browsing at Best Buy the other day. Sure, I read the mixed reviews mentioning the slow transfer times. But no one was reporting that you couldn’t stream stuff once it was set up. So I prepped myself to deal with days of slow copies from the external drives to the Seagate.

It was even more miserable than I anticipated. Directly connecting the external drives to the Seagate’s USB port proved to be the worst: I was seeing transfer rates of 1MB/s. At that rate, it was going to take a week. But it was worse than that- the Seagate became completely unresponsive. Clearly neither the Seagate nor Windows could handle a massive queue. Chunking it up was better- the OS and the Seagate at least remained responsive- but the rate was no better, and I really didn’t want to copy 500GB one gig at a time. Interestingly, I was getting 50 times the transfer rate when copying over the wireless network- about 50MB/s.

OS X, or rather *nix in general, provided a solution. The desktop support tech in me realized the best way to do this was to eliminate the U-factor entirely and let a program that specializes in large file transfers handle the grunt work. The usual OS “explorer” method (whether *nix or Windows) just isn’t built for the task: it bombs out, becomes unresponsive, and doesn’t have a very automatic way of dealing with duplicates- and the user is not always available to fix or retry. Robocopy, I would like to point out, is a strong contender for this. But only my MacBook reads both external drives, so I went for the built-in rsync utility. The plan was simple: sync the Seagate with each drive in turn, ending up with a single volume holding one copy of each media file from the two drives.

It is trivially easy to use the --ignore-existing switch to get exactly that behavior. And in OS X, all you have to do is navigate to the source and destination folders, then drag them onto the Terminal window to add their paths to the command, which ended up looking something like this:

rsync --ignore-existing --recursive --verbose /Volumes/source/audio /Volumes/destination/audio

The --verbose switch is important for your peace of mind, as you can watch rsync chunk through the files, calmly and consistently, until the job is complete. Granted, at 50MB/s it took a couple of days to finish, but it did not die and it did not max out the CPU or RAM. It quietly sat in the background doing its thing in a Terminal window, and I was able to work on the machine without any grief. I highly recommend this utility for moving large collections of files across networks.
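
One last tip: if you are nervous about what a sync will actually touch, rsync’s --dry-run flag reports what would be copied without moving anything, so you can sanity-check the command first:

# same command as above, but only lists the files that would be transferred
rsync --dry-run --ignore-existing --recursive --verbose /Volumes/source/audio /Volumes/destination/audio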
