Too Many Choices

There is a certain kind of rut that I get into when trying to make music on a desktop/laptop. It comes from having too many options in my music-making software.

My two primary programs are Ableton Live 9 (gotta upgrade to 10 still) and Reason 10. Within each program are tons of plugins, software synths, samplers, and routing options. And within each of those are the presets.

For example, if I want to make a drum beat using Kong (the drum module in Reason), it loads the default kit. Most of the time I don’t want to use it. What happens next is I spend ten minutes finding a preset I like, then another ten or so tweaking it. It’s not particularly complicated. But, it takes time. This is time I could spend actually making a new beat. Multiply this by any number of software instruments I want to use, and I end up spending a significant portion of my already-limited music-making time hunting for sounds.

The easy(ish) answer is that I should take a day to go through the presets of the modules that I use most and segregate, or otherwise highlight, the patches I like best. I can then use those as my starting point for new tracks. But, I suck at planning my music-making time and only think to do this after I am in the middle of a preset-hunting expedition.

This is where less-robust apps on iOS have an advantage. They don’t necessarily have a ton of sounds. Their interfaces are stripped down. I can open them up and start making a loop. If I don’t like the default patch or samples, there aren’t a ton for me to scroll through. Usually there is something I am happy enough with to get going.

Less choice is good sometimes.

Beat in Reason

I have been trying to spend more time making loops lately. A quick beat in Figure or Korg Gadget on my phone can be a replacement for mindless flipping on Instagram. I am still debating which method is best for sharing. SoundCloud is pretty straightforward and lets me control the CC license. But, Allihoopa is in some ways more integrated into the apps I use.

Here is something I made in Reason and then pushed to Allihoopa. Let’s see how nicely it embeds.

And, it looks like the WordPress auto-embed feature breaks Allihoopa’s embed. I guess click on the link above to hear the beat.

I also made this other song and put it on SoundCloud. Does it embed properly?

OK. That decides what service I will be using to share my sounds…

Twitter, I CAN Quit You

I recently logged back on to Twitter for the first time in ages, thinking that I would give it another try. After a couple weeks of sporadic use, I have changed my mind and am deleting my profile.

I have been on Twitter since 2007. I never got good at it. Never built up a network of friends (my closest friends don’t use it) or followers (I’m not that kind of clever or cool).

And now, it looks to be an awful place led by a person who won’t step up to make it less awful.

So, I’m gone. I have a feeling that it won’t be the last social media platform that I leave this year.

GarageBand and the Beat Sequencer

GarageBand hasn’t always been my jam, so to speak. I started my music-production fiddling with a cracked copy of Cool Edit Pro (manually slicing wave files) in college, moving on to actually buying Logic (before Apple bought them), Reason, and eventually Ableton Live.

GarageBand felt like kiddie stuff compared to those programs. Not that I am an all-pro music producer; there is plenty I don’t know in each of those programs. But, when there is something that I want to do in one of them, I can almost always do it. (Except for destructive waveform editing in Ableton. I hate not being able to do that.)

But, recently I have been enamored with GarageBand for iOS. Specifically, I am in love with the beat sequencer. Most DAWs have some form of piano roll grid for entering notes. Some are easier to use than others. GarageBand’s is my new favorite. It is easy to read, doesn’t require much zooming around, and is perfect for touch screens.

Here is a bit I was fiddling with the other day:

At a glance, I can read my beat with no scrolling around or squinting. The interface also has a couple of cool features that I haven’t seen in my other programs. First, the velocity and note repeat buttons at the bottom make those particular edits a lot easier and faster to implement. Second, each drum can have its own loop length. In the above example, the bottom row (a rim click) is only three 16th notes long. That cycle repeats independently of the rest of the hits, which are a full measure long.
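Conceptually, that independent-length trick is just each drum row wrapping at its own step count, so a 3-step row drifts against a 16-step row and the two only realign after their least common multiple (48 16th notes). Here is a quick Python sketch of the idea (my own illustration with made-up patterns, not anything GarageBand actually exposes):

```python
# Each row is a list of steps; 1 = hit, 0 = rest.
kick = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]  # full 16-step measure
rim  = [1, 0, 1]                                          # three 16th notes

def step(pattern, t):
    """A row simply wraps at its own length, independent of the others."""
    return pattern[t % len(pattern)]

# Over 48 global steps (lcm of 16 and 3) the rows cycle back into alignment.
# Collect the steps where the kick and the rim click land together:
both = [t for t in range(48) if step(kick, t) and step(rim, t)]
```

The drifting rim click is what makes a pattern like this feel less mechanical than a single fixed-length loop.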

Will I ever write a full song on GarageBand on iOS? In the past I would have said no. But, little features like this make me reconsider that thought.

Alexa, raise my children.

As my kids get older, as all kids get older, they become more aware of the various stimuli around them. For example, when I say, “Alexa, what’s the weather?” they look at me, and then over to the Echo when a response comes through. They are responding to their own names. Maybe they are starting to understand that we are “mom” and “dad”. But, what is their developing understanding of the meaning of “Alexa”?

Is it a name to them? Is it something we call out into the ether, resulting in a response from a real human? They interact with their grandparents and aunts and uncle during FaceTime chats. Who do they think Alexa is then?

I am not sure that I am ready for my kids to humanize electronics. The Echo is a dumb speaker that connects to the cloud to process voice requests and return responses with synthesized speech. It–and let’s never forget that it is an “it”–doesn’t have feelings, doesn’t have self-awareness, doesn’t have anything even close to human intelligence. None of us should be treating our devices as if they do. Especially when we don’t have much in the way of research on how these interactions play into child development.

I am, sadly, also not ready to give up on voice-activated computer querying. I believe that it is a significant part of our technological future. I also believe that perhaps we will develop general artificial intelligence in our lifetimes. If (when?) that happens, I am open to treating those new beings like I treat my fellow humans.

For now, I made a change that I hope will at least help my kids understand the difference between interacting with humans through technology and interacting with the technology itself. I renamed “Alexa” to “Computer”. You can make the same change in your settings on the Alexa app on your phone.

My kids don’t yet know that “Alexa” is a human name while “Computer” is what we call a class of objects. But, as they are developing, and Jessie and I point to things or people or animals and assign them names, this distinction between the “computer” and “grandma” will, I hope, make sense to them.

I just don’t want them to think that machines equal people, or that grandma has a wake word and takes commands.

Update: I drafted this post a couple weeks ago. Since then, the kids have further developed their relationship with the Echo. Now, when one of us calls out “computer” to it, all three of them turn their heads towards the speaker before a response comes. Oy.

Robot Factory

Come take a short tour of our robot factory. Only the highest quality robots, building the highest quality robots for building the highest quality robots.

Sometimes things break. Sometimes people break. But always, the work goes on.

Ableton Live Non-Song

What makes a “non-song”? In this instance, it is the lack of overall structure. Here I am, somewhat randomly, adding and subtracting loops that I recorded.

Most of the sounds come from samples of my 8-string guitar. The techno bass is the Novation Bass Station plugin. The drums are from the Ableton Live sample set.

I enjoy putting tracks like this together. They are pieces of something else. Maybe a story. Maybe a video. Maybe something that just sits there while you are doing other things.

Let me know what you think.

I’m Blogging Again

Why? No one blogs anymore.

I know, but I enjoy trying to create things that are entertaining. Even if only a few friends come and visit regularly, it seems worth it.

My plan is to write about technology, law, and the occasional bit of fiction. I’m not doing that last one on my own as much as I would like, so perhaps this can serve as a commitment device (and a get over yourself device). I also hope to share music (mostly beats and loops) and pictures (mostly of my dog or random macro pics).

Hopefully I will entertain a few people. If not, well, screaming into the void on a blog isn’t much different than doing so in meatspace.

(I’ll probably read this all later and feel cheesy and delete it.)

Calling my daughter a “princess”

I want to preface this post by saying that I don’t think there is anything wrong with traditional femininity or masculinity. But, I want my kids to grow up feeling that they can be whoever they are on the inside, which may include many variations around the gender/identity spectrum. When I see traditional character traits being projected onto them (and I am sure I do it too), it makes me ponder topics like this.

We begin imprinting on our children the moment they come into this world. If a baby cries and makes a particularly scrunchy face–he’s angry. An involuntary smile (that probably happened while passing gas)–she’s sweet. Sure, newborns have temperaments and personalities, but they are still largely blank slates. Personality comes only after adding up all of the little experiences, encouragements, discouragements, comments, and other stimuli (and of course the interactions with their genetics).

When our triplets were in the NICU, I could see how differently they were treated, or rather how the nurses and doctors referred to them. We have two boys and one girl. And the thing that really stood out for me was that many people called our daughter “princess.”

What made her a princess, though? What does it mean to be a princess? To me, that word evokes femininity, frilly pink clothes, and being protected. It is an indicator of the “girl” corner of the gender spectrum and all of the things we associate with girls.

And if people keep calling my daughter a princess and keep buying her pink clothing and treating her as a fragile treasure, how will her little developing brain respond to this? Will she take on those traits, those things that in our Western culture we associate with the feminine? Or are there elements within her that would come out no matter how she was treated?

I have had many friends with children comment to me how different their boys and girls came out. With so little difference in upbringing, they were amazed at how their boys gravitated towards trucks and army men and their girls got into dolls and frilly dresses. Now that I have my own kids, though, I have to wonder, how soon were people calling their girls “princess?”

For further reading on actual research around how we treat our kids, check out this article in the Washington Post.

Augmented Reality and My Kids

I know what I need–Augmented Reality (AR) glasses for tending to my newborn triplets in the middle of the night. Think about it. It’s 2 a.m., one of the babies starts crying. They all start crying. I wake up, bleary-eyed, confused. I put on my AR glasses and say “baby lights on.” In my field of vision, it looks like I have turned all the lights in the house on full blast. I walk downstairs, get their bottles ready, change them, feed them, put them to bed. I never turn on the lights as far as they are concerned. But, I see everything, get enough light to wake up, and when I am done, take off the glasses and pass right back out until the next feeding time in what feels like 20 minutes.

Of course, there are technical problems here. Headset AR is too bulky right now–I am not going to strap on a HoloLens and stumble around the house. I also don’t want to horrify my children as I am leaning over their cribs with a giant mask on; I may as well wear a pair of night vision goggles. I need something that wears like a pair of glasses.

Aside from the bulk, the type of simulated lighting that I want is not generally available yet. What would it take to scan the room, use the available light to map it, and then re-render everything as if it was bathed in bright light? If it isn’t possible now, though, it will be. Heck, the technology will eventually be able to bathe the room in starlight, or make the room look like it is on the surface of another planet. If I am going to tend to my kids in a fog, maybe it can be the fog of Venus’s atmosphere.

I’m ready. But, by the time this is all available, I think the kids will be past nighttime diaper changes.