As my faithful readers know, I had a less than stellar production experience while attending the Supernatural Convention. For those of you who are not familiar, Supernatural is a television series on the CW network. The lovely wife was lucky enough to win free admission to the convention. I went along to take pictures; they can be found at http://photos.bradfordbenn.com/Events/Supernatural-Convention-Nov-2013. (At the moment the images are very raw and still need some adjustments, so do not be surprised if there are some changes.)

The first thing I want to clearly indicate is that the volunteers, the people who barter their services as facilitators in exchange for tickets to various events, were great. They were all very helpful and provided information as best they had it. Much of the disappointment is about the choices made for the audio, video, and lighting equipment. I am not singling out an equipment manufacturer or brand; the problems were a result of using equipment incorrectly. Let me say that again: I am not saying that any of the equipment used was inferior, I am saying that the use of the equipment was not appropriate.

First, let’s talk about the room to get an idea of the space. The ballroom that was used is over 15,000 square feet and can seat up to 1,900 people theater style. It is 105 feet long x 143 feet wide x 18 feet tall; it is a large room. I do not think that the room was filled to 1,900, but more likely 1,700, based on the space needed for video and backstage areas.

At the front of the room was a stage about 18 inches off the floor and probably 24 feet by 12 feet. It was flanked on each side by a 12 foot wide by 9 foot tall rear projection video screen. Next to each screen was a powered speaker. A third of the way back against each of the side walls was another speaker. Notice that the picture I took is in focus…

Wideshot

I did not get backstage to see the video system, but I can tell you it was standard definition at best. It was not very bright or sharp. There was also a constant ground loop bar scrolling on the video screen. Since the speakers were out front, I can say they were 12-inch, two-way powered speakers rated at 131 dB peak with a 75-degree conical coverage pattern. The brand does not matter, as it was a quality product just being asked to perform a task it was not designed for. There was also a powered 300 W floor monitor on the stage for the talent and a duplicate on the front of the stage as a “fill” speaker.

There were two Ellipsoidal Reflector Spotlights against the wall on each side for fill lighting. They were basically even with the heads of the talent and were not very bright; the power draw was low enough that they were simply plugged into standard 15 A outlets. They were not very effective at all, to the point that most of the time it was simply the house lights being left on to see the stage.

I did not see a front of house position, so I did not see how the lights, audio, and video were controlled. However, I would not be surprised if the system was used in a set-and-forget mode, as there were often problems.

The production problems started from the beginning: the video was out of focus. It was definitely standard definition and not clear on top of that, just a feed from a single camera at the back of the room. I do know that there was some video processing, as a few times there was text overlaid on the video image. The best way I can describe it is that it looked like a 1990s high school video. Also, about once a session someone could be seen walking through the projection cone, so the backstage area must not have had any clear indication of where the cone was.

The audio problems started very soon after the event began. I figured the system was just having some teething pains as the show had just started. The first problem with the audio system was the entire system sounding boomy and not as clear as the equipment could provide. Much of that, I think, was due to the system trying to cover a space that is too large.

Two and a half hours into the event, my first questions about the system approach started. There were wireless dropouts, a dead microphone, and audience/question microphones at the edge of the room. The problems with the audience microphones were three items in my opinion: they were not loud enough in the talent foldback monitor, they were wireless when they could have been wired, and they were located so that the talent was always looking away from the main audience.

Let me explain the looking offstage comment. By placing the audience microphones at the front of the seating area and at the outer edges of the room, the talent was often looking off stage, not at the main audience. The reason was that the talent was being polite, having a conversation and making eye contact with the question asker. The talent was doing the proper thing. The problem is that the single camera in the back of the room simply had them in profile, which kept the audience from getting to see the complete interaction.

Four hours in, the system was not sounding any better; in fact, the deficiencies were getting more pronounced. I believe that part of it is the pile-on effect: the first flaw had been found, so it was easier to find other ones. The use of a compressor and/or de-esser would have greatly helped the sonic performance for the guests. The audience would have had an easier time listening and there would not have been as many plosive sounds.

Fifteen minutes later the talent was literally walking off stage to listen to the guests directly, as the monitor was not carrying the audience comments to the stage. The audience comments were audible in the house system but not in the monitors. Of course, there were also times that the audience microphones were not working at all.

The last presenter of the day had some audio sources with him. Now I am not going to say that I understand all of the voodoo that the talent was using with his ghost hunting audio devices. The approach was to literally have the talent hold the handheld battery powered speaker up to the microphone for the audience to hear.

One of the things I did not mention was how often there was a ground hum; it was not constant, it would come and go throughout the day. It got worse during the second day, when the entire house left audio channel was replaced by a ground hum. Yes, no audio for the left side of the house.

That night there was a karaoke event. It was a lot of fun, but it could have easily been much better with better equipment. The same system was being used to reinforce the karaoke. There was no low end, and the system was in full clip throughout the evening. I am not sure where the clipping was occurring; it could have been the sub feed from the karaoke system they brought in. Either way it was audibly distorted. I am very glad I had ear plugs in, especially when the feedback started. It was not momentary feedback.

The second day started with the wireless microphone failing and needing to be replaced 10 minutes into the first session. Yes, ten minutes. Then came more feedback. It got to the point where the presenters were making fun of the audio quality. Yes, from the stage the talent was making comments about the system performance. It obviously was not the first time these problems had occurred.

The same issues occurred throughout the second day of the event. So rather than rehash all of them, you can read the tweet stream at the previous blog post (Tweets against the audio machinery).

That night there was a concert with Louden Swain. There was no dedicated music audio system; it was the same system as the rest of the convention. Many times the stage volume overwhelmed the public address system. The talent was actually adjusting the aiming of the speakers to improve the sound, and I think they did a decent job.

After the concert there was a limited attendance event, with a separate PA system that I believe was brought in by the DJ for the event. This system was able to keep up much better; not only was the room smaller, the equipment was more suited to the use. The system was two self-powered 15-inch, two-way speakers with a maximum output of 132 dB. It was much better, not just for voice but for music as well.

The third day was much the same in terms of performance. However, the issues with the monitors and feedback got to be so bad it was comical. One panelist asked if they were going deaf as they could not hear a single question; the audience started relaying the questions for them. During a two person panel, the talent heard so much feedback they started doing synchronized microphone movements, “ringing out” the monitors to try to fix the issue. At one point during a break in the panels, feedback rang out with no talent or microphone on stage. It was so loud and painful that guests were screaming from fear and pain.

The reason I bring these up is that the audio and video systems actually impacted the guest experience. No one there other than the wife knew what I do, and yet there were still conversations going on around me about the problems with the audio and video. People were talking about how bad it was, why there were so many problems, how this convention happens multiple times, etc. The event became a caricature of poor audio and cheap conventions.

Many of the problems could have been avoided simply by selecting different equipment. The equipment was reputable, just not the right selection for the room and use. This convention is a key example where renting a good system for the space would have greatly improved the experience. I am not naive; I realize that this event is for profit and that reducing equipment costs means more profit. But the fact that tickets ranged from US$150 to US$650 for all three days, plus additional fees for the autographs and picture opportunities, makes me feel like the frugality is unwarranted.

As some of you who follow my Twitter feed know, I went to a fan convention with the wife last week. I am still gathering my thoughts and writing a blog post about the experience. However, I wanted to gather all of the Tweets together in one location for those that might have missed some of the experience. So, presented in chronological order and unedited, are my Tweets about the event. Dates and times are Pacific Standard.

====

21 November 2013 20:12

For those of you who think my twitter stream is eclectic, brace yourself.

@GentlyMad is taking me to a fan convention for Supernatural…

====

21 November 2013 20:15

For those of you following along the link is http://t.co/r9p51GywY3 i am looking forward to meeting @feliciaday the rest is unknown.

====

21 November 2013 20:24

1st tweet of #BURCON, waited in line to register and @GentlyMad’s not available yet. Could have still been drinking, watching hockey.

====

22 November 2013 12:27

Instead of listening to @AVNationTV live podcast @GentlyMad has taken me to #BURCON and the video is out of focus

====

22 November 2013 14:39

Must resist urge to go tweak audio  at this #BURCON event. I think @GentlyMad would kick me if i did. Must restrain myself….

====

22 November 2013 14:59

#AVtweeps how often do you change batteries at panel event? Wireless drop outs, understandable but thinking wired for question mics

====

22 November 2013 16:20

So at #BURCON with @GentlyMad watching video mistakes and listening to drop outs. Feel bad to be making light of other people’s problems…

====

22 November 2013 16:30

So this session is being brought to you  without a compressor or de-esser. Must resist the urge to go fix the mix… Hope not someone i know

====

22 November 2013 16:48

More guest audio in the stage monitor and perhaps less level in the house to make people talk louder #BURCON

====

22 November 2013 16:54

Current play back method is presenter holding speaker to microphone from  MP3 player. presenter had it, was planning to use. Line in please.

====

22 November 2013 19:29

#BurCon Day 1 is almost done, a karaoke dance party to go. I really hope they bring in an audio music system and not use the voice system.

====

22 November 2013 21:53

@cabbey yes #BURCON is using speech system for music/karaoke system. No subwoofer and no punch.

====

22 November 2013 22:01

Earplugs firmly in place. Much needed. Audio system: All CLIP all the time at #BURCON

====

22 November 2013 22:25

I really enjoy the 60Hz waterfall on the video as well. Man av at its finest

====

22 November 2013 23:09

If you can’t ride a fader to prevent feedback in the house system at #BURCON i can recommend some feedback suppressors.

====

22 November 2013 23:10

Hey #BURCON why have the stage lights so low? Photography is allowed why not allow the patrons to get good shots?

====

22 November 2013 23:11

Yes Snarky Mode is activated. @GentlyMad said it was allowed as long as i take pictures.

====

22 November 2013 23:20

Well clipping for hours has got to be good for drivers

====

22 November 2013 23:43

I know the purpose of reverb and autotune, it should be used on karaoke, unfortunately it is not being used at #BURCON. Ah ear plugs.

====

23 November 2013 10:27

Realy #BurCon the wireless mic died 10 minutes into the first session. Then feedback. Now people making fun of audio.

====

23 November 2013 10:39

You know the audio is a problem when @GentlyMad is looking at me knowing i want to fix it…..

====

23 November 2013 12:36

#AVtweeps just a friendly reminder, don’t skimp on audio monitors. Difficult to watch #BURCON talent comment on audio on stage.

====

23 November 2013 13:14

#BurCon audio hits continue. 60Hz hum is louder than talent. It just started…. Hmmmmmmmmmm @GentlyMad is amazed i haven’t clawed ears off.

====

23 November 2013 13:21

Now #BurCon talent needs to walk off stage to hand mic to audience questions. SPL keeps going up to point of ringing and slapback is louder.

====

23 November 2013 14:05

@brockmcginnis nope it is a live fan convention so it is the production staff. When talent makes fun of audio & video…. Well ……..

====

23 November 2013 14:27

@brockmcginnis yes i agree i should not slag people but the system they are using is showing its wrinkles and uncut rough edges.

====

23 November 2013 15:52

Best line at #burcon so far, something i swear @GentlyMad would say. “I just threw my microphone cozy at him.” By @dicksp8jr  (windscreen)

====

23 November 2013 16:42

there are these things called mute buttons on audio consoles. the team at #BURCON should use them as @GentlyMad is asking me questions…

====

23 November 2013 17:07

Literally the house left of the PA @ #burcon was no content just ground hum. Now feedback and ringing….  Sigh

====

23 November 2013 17:51

More audience in the monitors please #BurCon the talent can’t hear the questions. Sigh

====

23 November 2013 22:29

Tonight’s #BurCon question. Will @loudonswain have a PA or just the voice system. Any guesses

====

23 November 2013 22:49

For those of you scoring at home, and those that are alone, there is no Music PA. Just feedback, stage volume and voice PA for vocals.

====

23 November 2013 22:49

But i have @GentlyMad and a camera plus some cool @BorrowLenses glass so it is still all good.

====

23 November 2013 22:53

@mattcohen4real is doing a good job tweaking the speakers at #BurCon @GentlyMad says i can’t help. Really on both counts

====

24 November 2013 00:25

PA at #BurCon after party is much better than the main system. Amazing what happens when system matches  use. There is low end and headroom!

====

24 November 2013 14:37

No @feliciaday you are not deaf, the audio system @ #BurCon is not keeping up. I know it can be better….. Sigh

====

24 November 2013 19:53

#burcon really could use a high pass filter on the microphone. It is so boomy i am putting in ear plugs…. Things i do for @GentlyMad

====

24 November 2013 20:28

Appropriatte way to end #BurCon, dead microphone…. 2 minutes into panel

====

25 November 2013 18:57

@brockmcginnis @rAVeBlogSquad @stillbeingmolly i will be writing up a blog post about production at #BurCon & how it impacted event for all

====

Faithful followers may have noticed that recently there has been a large amount of NASA and space related content from me. Some people think that this is new; in reality it has been an interest of mine for most of my life. One of the earliest experiences I can recall is watching the launch of the American half of Apollo-Soyuz in July of 1975. It is so clear to me that I even remember the room I was in at home, down to the orange carpet.

My grandfather worked as a civilian at Fort Monmouth as part of the U.S. Army Electronics Command (CECOM), my understanding is in the Electronics Technology Laboratory. I remember us building models of TIROS satellites and talking about how relay communication worked when I was younger. He would share all sorts of stories and technology with me. I even had a jumpsuit like an astronaut that either my mother or grandmother made me. I remember taking my first plane trip when I was six years old to visit my aunt, uncle, and cousin in the Washington DC area to go to the Smithsonian Air and Space Museum. From that same summer I remember the MacNeil/Lehrer Report about the Viking landings in 1976. One of the few toys/games I miss from my youth was a puzzle I got at the Smithsonian Air and Space Museum that showed all sorts of facts about flight and space. I used to put that puzzle together all the time. It was not just to put the puzzle together and see the pictures; it had text on it, so I could read about the X-15, the Wright brothers, and things in between.

In the early 1980s I was fanatically following the Space Shuttle program. I clipped articles, read magazines, and covered my walls in articles and pictures. One of the cool things was that I once again went and visited my family in the Washington DC area. My aunt worked in the Department of Education and, luckily for me, happened to be in a building shared with NASA. So that day not only did I get to go to the Smithsonian Air & Space Museum again, I also literally got to walk the halls of NASA. How cool was that? I remember looking at the pictures on the wall and being awestruck. The coolest thing was that my aunt found some posters of the Space Shuttle for me, originals from when the external tank was white. I also did all the typical space fan things, such as building model rockets, purchasing telescopes, and sky gazing. My grandfather coached me through some math puzzlement so I could learn more.

Then came high school. I can remember sitting in biology class when the Challenger explosion occurred, literally down to which seat. It was a sad day.

Once NASA got flying again, college and girls started to fill my attention. I was still interested in technology; I just did not have as much time. Unfortunately, at that point space flight got to be “commonplace” and I was not following it as much day to day or project to project. Through my job as an audio integrator I got to work on projects at the KSC Visitor’s Center, including the Saturn V Experience. Since I was living in Florida, I went to a few launches.

I still watched and read quite a bit about space and space flight, but more of the history than the current programs. I read and watched The Right Stuff, From the Earth to the Moon, Lost Moon, etc. So I never lost the interest; I just was not actively studying and following it. Then a friend of mine got selected to take part in a NASA Tweetup event for a launch. I started following space more and more. I subscribed to the NASA Twitter streams. I started getting the daily newsletter. I would stream NASA TV when hockey wasn’t on. It started creeping back in. Heck, the last three books I have read are about space.

I kept entering for chances to take part in NASA Socials (the new name for Tweetups, as it is not just Twitter anymore). I kept not getting selected, but I kept entering, as hope springs eternal. Then came the one that I finally got picked for, “Celebrate Kennedy Space Center’s 50 Years of Human Spaceflight”. I would get to see the cool stuff you don’t get to see during the normal tours. I would get to go to the place where the program got off the ground. There was not a moment of hesitation about whether I would be going; the hesitation was about how I would pull it off.

The Lovely and Talented Wife (a.k.a. @GentlyMad) said she would help with the driving. Flying was cost prohibitive, and driving approximately 20 hours each way on my own did not seem like a good idea. But it was on: I told the boss I was taking three days off and off we went.

Now comes the back half: digging out from the 1,000+ pictures I took and trying to capture as much of the experience as I can in words. So in the next few days and weeks expect to see a heavy amount of space content. I expect the trend to continue for a while. Actually, to quote the L&T Wife, “I hope the space bug continues”. So if you will excuse me, I have to go pack for another business trip, upload some photos, review some photos, and watch Mars Curiosity landing coverage.


Originally posted: August 5, 2012

An update: The Logitech G13 is no longer compatible with the latest Mac updates. The replacement I am using is the Elgato Stream Deck, as it provides cross-application features. I was also considering a Razer Tartarus V2, as it is Mac compatible.

Bradford
October 4, 2020

Oftentimes the controls for a piece of software are not in the friendliest locations for one-handed operation. By one-handed operation I mean one hand on the keyboard, one hand on the mouse. When working in graphics programs I find myself working that way quite often. It could be as basic as a drawing program where I need to use the Z key to initiate the zoom function and then use the mouse to decide where to zoom. Other times it is more complex, such as selecting an image, zooming into a one pixel to one pixel rendering, panning, and then marking the image as a keeper or a chucker. It could just as likely be a drawing program where I am documenting an idea. For my #AVTweeps, just think AutoCAD.

Recently I found myself sore at the end of an image review session from unnatural movements. My data management workflow is outlined in a previous blog post. However, looking at the actual process, I began to find lots of moving of the hands. My review process is based around the use of Adobe® Photoshop® Lightroom® (quite the mouthful, so Lightroom for short). The program itself is very powerful and does help me manage my images, pictures, and photos. The program lacks some ergonomics for the one-handed user, though.

The way I cull images is that I go into Library mode and review the images at a resolution that fits the screen. I then quickly look at each one and decide whether it is a Pick, Unmarked, or a Reject. These selections are done using the P, U, and X keys. Notice how they are laid out on the keyboard.

Keyboard with P, U, and X highlighted

Not very easy to navigate with one hand. Now let’s say I want to zoom into an area: one can either use the mouse to enter a 1:1 view or press Shift and the spacebar to enter the same mode, then use the mouse to zoom to areas. I do this to see how much aberration is visible and whether the image is in focus; once again I decide if it is a pick, unmarked, or rejected. Lightroom has a setting to advance to the next image after assigning a value to the image.

That setting seems like it would save time, and it does quite often. However, if I want to assign two things to an image, I have to back up to the image. If I find an image of the same subject later in the batch that is better than a pick I already decided on, I go back to unmark the previously picked image. So now I have a few options. I can expose the filmstrip at the bottom of the application window, click on the image with the mouse, and then press U. If the image was just the previous image I can use the arrow keys. If you notice, both of these options require me to take my right hand off the mouse and place it on the right half of the keyboard. Now, I could also just use my left hand on the right side of the keyboard, but that still means changing positions.

Let’s say I want to see if a crop makes an image better. An example of a crop changing an image happened at the baseball game I took pictures at: since I was sitting in the stands, some of the images have the backs of people’s heads in them. Cropping the heads out made the pictures better, but some were still chuckers not keepers. In Lightroom I enter crop mode by pressing R, which switches to the Develop module, where I use the mouse to make the crop. Once I finish the crop, I want to mark the image as a keeper or chucker. I cannot do that in the Develop module; I have to be in Library mode. To return to Library mode I either take my right hand off the mouse to do the keyboard contortions or move the mouse away from the work area. Neither solution is very ergonomic.

There are keyboards available that are designed to fix some of these issues by changing the keyboard layout and having labels on the keys. However, some are more expensive than the program itself. They are also dedicated to the program, so I would still need my regular keyboard for such things as entering text. Not really what I was looking for.

I started thinking about it more and more and came up with a more practical solution, in my not so humble opinion. I purchased a customizable gamer keypad, a Logitech G13 Programmable Gameboard with LCD Display, as it is Mac compatible (yes, it is also Windows compatible). (If you decide to buy one after reading my blog, using this link will give me a little commission.) This would let me decide how the keystrokes would be used. I could lay them out to my satisfaction.

I then determined what keys I used most. They are both left and right handed, and some of them require both hands, such as entering Library mode (Command + Option + 1).

Commonly Used Keys on 110 Key Keyboard

These main keys were then assigned to the keypad in the way I found would work best for me. (Drop me a line if you would like a copy of the configuration file.)

Key Assignment for Gamer Keypad

I had 200-plus images from a business trip and figured that would be a great way to test it out. So I went through the images and did the rating, cropping, and keywording in about an hour, including uploading to a SmugMug gallery. There was another unexpected benefit: I was able to hide all of the tool palettes in Lightroom so the images were bigger on the screen during the review (remember, bigger is better). I do not have exact times for similar tasks using the “standard” keyboard commands, but the important thing is I was not sore and it was not as tiring for me.

The keypad did the thing that I think all tools should do: get out of the way and let me work. Other than when I had to type in keywords, I used just the keypad and the mouse. I did not have to move my hands around the keyboard and mouse.

I also learned a couple more tricks in the process. I can use the keypad in more than one program but keep the key functions the same. By key function I mean that the same key that sends an R to enter Crop mode in Lightroom can be configured to send a K in Photoshop or Command + K in Preview to perform the crop functions. The same key press to me sends different keystrokes to the application. Much easier than having to remember all the different commands, similar to Cut, Copy, and Paste being the same in almost every program. That is a fine example of what I was trying to accomplish; cut (Command + X), copy (Command + C), and paste (Command + V) are not great mnemonic devices at first blush, but the arrangement of the keys makes it very easy to use.
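To make the translation-layer idea concrete, here is a rough sketch in Python. It is purely an illustration; the real mapping lives in the Logitech profile software, the function name is my own invention, and the dictionary just mirrors the Lightroom, Photoshop, and Preview crop example above.

```python
# Purely illustrative sketch (not the actual Logitech configuration, which is
# done in its profile software): one keypad key, a different keystroke per
# application. The three values mirror the crop example in the paragraph above.
CROP_KEY = {
    "Lightroom": "r",        # enters the Crop tool (switches to Develop)
    "Photoshop": "k",        # the crop keystroke mentioned above
    "Preview": "cmd+k",      # Command + K to crop in Preview
}

def keystroke_for(front_app, mapping):
    """Return the keystroke to send for whichever application is frontmost."""
    return mapping.get(front_app, "")  # unmapped applications get nothing

if __name__ == "__main__":
    for app in ("Lightroom", "Photoshop", "Preview", "Mail"):
        print(app, "->", keystroke_for(app, CROP_KEY) or "(no mapping)")
```

The point is the same one made above: I only have to remember one key, and the keypad worries about what each program expects.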

As things are becoming more and more automated, I feel that the understanding of the process is being lost. I believe that tools should make my life easier and allow me to spend my time doing other things. However, there is a downside: does one always understand the automation that is being performed? While these tools can be great timesavers, what happens when one doesn’t work or you don’t like the results? Understanding the process that the automation is simplifying is key.

A common example is defining an IP network. Most people simply connect to a network and let a Dynamic Host Configuration Protocol (DHCP) server assign the address. This happens at the office, the home, the coffee shop, pretty much everywhere. When it doesn’t work, for whatever reason, knowing where to start troubleshooting is a mystery to some. I use DHCP quite a bit; I also know how to do the entire process manually. I can manually (not that I want to) calculate the subnet and assign the addresses. When there is no DHCP, I am still able to get connected. If I am still unable to get connected, I am able to call tech support and describe the problem effectively.
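As a rough illustration of the manual side, the subnet arithmetic that DHCP normally handles can be worked through with Python’s standard ipaddress module; the 192.168.1.0/24 network and the .42 address below are just example numbers, not any network from this story.

```python
# A minimal sketch of the subnet math that a DHCP server normally hides.
# The 192.168.1.0/24 network and the .42 host are just example numbers.
import ipaddress

network = ipaddress.ip_network("192.168.1.0/24")
print("Netmask:           ", network.netmask)            # 255.255.255.0
print("Network address:   ", network.network_address)    # 192.168.1.0
print("Broadcast address: ", network.broadcast_address)  # 192.168.1.255
print("Usable host count: ", network.num_addresses - 2)  # 254

# Sanity check a manually assigned address against the subnet
host = ipaddress.ip_address("192.168.1.42")
print("192.168.1.42 is on this network:", host in network)
```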

While IP networking is a common example, this occurs with other technologies as well. I have an interest in photography and have been doing more processing on images. For some of the process I work manually; for other parts I use automation tools. An example of this process is this picture of Martin Brodeur I took.

Straight out of camera, no processing

I took the shot in a manual mode, shutter priority, and I also told the camera where to focus to get Brodeur in focus and the background blurry. I could have accomplished a very similar effect using the Portrait Mode preset in the camera, but I wanted to control the look of the picture. After I took the picture I did some work on it in Lightroom and Nik Software. In the process I adjusted for the lens, applied a vignette, applied noise reduction, and converted it to black and white. This process was a mix of manual and automated steps. I could have just clicked a few buttons and called it done. Instead I made decisions along the way, and I understood the impact of those decisions. I was able to decide the final mood of the image as a result.

Processed picture, click to see entire gallery

This result is much better because I controlled the process and got the result I wanted. Did using the automation for part of it save time? Yes, it did. Since I had taken the time to learn about the conversion process (http://www.dgrin.com/showthread.php?t=114917), I was able to understand the questions and obtain the result I wanted. Now if you will excuse me, I need to troubleshoot my network as the Wii is not connecting to the Internet.

Recently I ran across this story, http://thestolenscream.com/, about a picture that was taken from a photographer’s Flickr site and was being used around the world. He was not being compensated. It is both an amazing story of how something can go around the world just by being good and of how at times people’s work is stolen. The video is 10 minutes long and is well done. The back story and video link are available at http://fstoppers.com/fstoppers-original-the-stolen-scream/

Notice what I have done above: I clearly indicated where the information is located. I could have just as easily gone into YouTube and gotten an embed link to put into my blog. I also could have just as easily downloaded the video and edited out the credits. But that would be an insult to the people who created it; I would basically be stealing their time and effort.

I know that some of my readers are more familiar with audio video system integration than with photography. The same thing occurs there and in other places as well. It might not be a picture; it could be a grounding scheme or a user interface panel, just for a sample. Perhaps it is finding information on a manufacturer’s website and including it in your information package. Often manufacturers are okay with that, if you are using the information to sell and use their products. However, that does not always happen.

Last year I was very surprised when someone called me to complain about a training video I did that was on YouTube. I was not surprised that I got a complaint; rather, I was surprised that it was on YouTube. I did not upload the video there. I uploaded it to my work website. Not a huge deal, as it was information about our products, but then it started to sink in. This website had taken someone else’s work, made some edits, and was then presenting it as their own. They even placed their company logo over the video.

Someone else was appropriating all of the time and effort placed into the video. I understand that anything on the Internet is capable of being copied. Basically, what annoyed me the most was that the effort put forth to collect and present the information was not being recognized; someone else was just taking it.

That seems small, no one harmed, right? That is somewhat correct. My company paid for me to make the video, and the product was still being promoted. However, what happens if it is not a sales tool but rather a picture of a landmark, a presentation about a topic, a system design, or a configuration file for a piece of equipment?

The information is being provided without compensation to the creator or even acknowledgment. Basically, that person’s time, effort, and knowledge are being stolen. If it is licensed under Creative Commons terms, the creator expects certain respect in the process. If it is not expressly stated that it is okay to use, it should not be used.

The best example is someone who is creating a presentation or proposal and needs a picture of a movie theater. I found a nice theater image on Wikipedia taken by Fernando de Sousa from Melbourne, Australia, and licensed under the Creative Commons Attribution-Share Alike 2.0 Generic license. That license requires attribution. Mr. de Sousa is a professional photographer. He takes pictures for compensation. He shared his work, the results of his skill, equipment, experience, and knowledge. All that he asks for is credit. Will you provide it?

Think about it another way. You went through the process of creating a proposal for a project. You outlined the equipment and process you are going to use. You provided information about why you chose that approach. The person you made the proposal to decides not to hire you. Instead they take your proposal package and use it to create the project themselves. Would that annoy you? Would you expect compensation? How about if all you asked for was attribution?

So I ask everyone to please respect the Intellectual Property, time, effort, and knowledge that is provided on the Internet and provide attribution at least. Don’t take credit for other people’s work.

I am off to go place watermarks on my stuff, if you would like to use an image without it, just ask.

Also known as “The Disconnected Challenge” or “Offline Challenge”. It has become more of an issue since everything has gone to the “Cloud”. What happens when one cannot connect is something to be considered.

Bradford
October 4, 2020

Another blog post written at 32,000 feet, as that is when the issue hit me. I have various electronic devices, as my dedicated reader knows. I have previously talked about various data access connection challenges. This new challenge is not one of my own doing; it is a poor user experience or use case definition. This problem was illustrated by Amazon and their Kindle applications, but it does not apply to just them. This challenge happens with many applications beyond this example.

I have found a time where the advantages of electronic delivery of a book outstrip the disadvantages I previously outlined. This happened with a “for Dummies” book. At work, I am on a software implementation team rolling out a new application package. I wanted the “for Dummies” book for the application. I looked at Amazon and the book was available both in paperback and in Kindle form. The Kindle version was substantially less expensive, but the key item was that I could get literally instant delivery. While on a conference call I was able to purchase the book, take delivery of it, and reference it during the call. It was very powerful and better than using Internet search tools, as it has a high signal to noise ratio and no rabbit trails.

The next day I had a business trip; I had my analog reading material and my electronic versions. On the plane flight I started to truly read my newly purchased book. It was also the first time I had started to explore some of the Kindle application features. I saw that there were sections of the book that were underlined. Not underlined text, but text with a dashed underline. I was not sure what it was at first, but I found out that it meant that other readers had highlighted that passage. The idea of crowd sourced highlighting was intriguing to me; it helps to know what areas one should pay attention to.

I wanted to see what other features were available. My brain needed a little break from thinking about business practices, so I was going to use that time to browse through the help file and see what other features were available that I might not be using in the Kindle application. I was airborne when I wanted to do that, and I had no Internet access on that flight. As a result of not being connected to the Internet, the help file was not available.

That seems very counterintuitive; why would an electronic reading application not include a help file with it? Think about that for a moment. Something that is designed to read documents while disconnected from the Internet is not able to read its own help file while not connected. It is not just Kindle that has this design flaw. Cloudreader, Nook, and iBooks for iPad do not have a help file that is readily available. I am sure that I can continue to list others as well. It also occurs with applications for workstations.

Not all applications are that short-sighted. Two applications on my iPad have help that is available offline. iAnnotate and DocsToGo install their help file as a document you can read from within the application.

Makes perfect sense to me. An application that is designed to be portable should have supporting documentation that is portable. So for those of you involved in the design and creation of applications, think about the user who is not connected to the Internet. They might want to refer to the supporting documents; you should make it easy for them. The fact that I turned to the help file already means that the application is not intuitive enough. Do not compound the issue by making it difficult to find the help.
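For what it is worth, the idea is not complicated to implement. Here is a minimal sketch of the “ship the help with the app” approach, with hypothetical file and URL names: look for a bundled local copy first and only fall back to the network as a last resort.

```python
# A minimal sketch of the "ship the help with the app" idea, with hypothetical
# file and URL names: prefer the bundled local copy, and only fall back to the
# network as a last resort.
import os
import webbrowser

LOCAL_HELP = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                          "help", "index.html")          # bundled with the app
ONLINE_HELP = "https://example.com/myapp/help"           # placeholder URL

def open_help():
    if os.path.exists(LOCAL_HELP):
        # Works at 32,000 feet: no connection required.
        webbrowser.open("file://" + LOCAL_HELP)
    else:
        # This is the branch that fails the disconnected user.
        webbrowser.open(ONLINE_HELP)

if __name__ == "__main__":
    open_help()
```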

This concept also applies to those of you who are creating custom control interfaces using software created by others. On more occasions than I would care to count, I have ended up troubleshooting a control system and having to guess. These guesses could range from which IP addresses to use to connect to the system, to what the control system is using for the backend, to how to get help.

For the application users, I recommend that you try out your applications before you are traveling with them or disconnected from the Internet, to make sure you understand how to use them. The help files might not always be available.

Well the fasten seatbelt sign just came on….

<note this post was recreated after a website crash, good thing I backed it up>

Since writing this post in 2010, I have gone away from JungleDisk. I found that it was using up too many clock cycles in the background. I am now using AWS and ChronoSync.
Bradford
October 4, 2020

I have found a few things out over the past few weeks that I figure I will share with you, my faithful reader. I had a logic board failure on my MacBook Pro, which meant that I was sans laptop for approximately 10 days. The day after I received it back, less than 12 hours later, the cable modem at my house failed.

So between not having my personal laptop and then Internet access being a car ride away, I discovered some items along the way.

  • Backing up Data is important, but one also needs access to the data

There are a few other tangential things I have found out as well, such as changes to my photography workflow, the idea that online instructions should not be the only instructions, and the fact that unfettered Internet access can be a key item; but those will be separate posts.

Using my backup solutions, none of my data was in jeopardy; however, using that data was the challenge. I have been using JungleDisk as my incremental off-site backup solution. It works very well for me, but it came with some choices whose implications I was not fully aware of when I made them. Using a block copy approach, I could reduce the amount of bandwidth and storage space I use; however, this does not come without tradeoffs. By making this choice I would be unable to browse the files online; I would have to actually restore them using the client software. At the time I did not think that it was a big deal, as I figured I could always just install the client on another computer and get all the data back.

A key item here is that it is my off-site backup. Too many people think that just having a backup is sufficient. It is not, as there are other things to consider than just a hard drive or computer failure. One has to think of other ways that data can be destroyed: “Someone stole my car! There was an earthquake! A terrible flood! Locusts!!” Having the data off site makes it much less likely that data will be lost.

I could have just installed the client on another computer and gotten all the data back, but that still was not going to solve all my issues. As a result of not being able to browse the contents, I am going to change my approach yet again.

Some items will be backed up using block copy, other items will be backed up using file copy, and still other items will be backed up to either MobileMe’s iDisk or to my Dropbox account. You might wonder what data would go to what place and how to keep it all organized; that is actually fairly easy as long as I make the right decisions when starting. Just by putting files into different locations on my computer, they will be backed up in different ways. Placing items into the Documents directory will place them on JungleDisk, placing items in the Dropbox folder will put them on Dropbox (obviously; I am still waiting for selective sync before I am 100% happy with it), and items stored in iDisk will be on MobileMe iDisk.

The key to this approach is to make sure that a file is stored in one location and only one location for live data. I have often encountered problems where two files have the same name but different time stamps, or live on different computers, so how do I know which one is current? Since all of these items are backed up to the “cloud” of the Internet, I do not have to worry greatly about the loss of data. I still do backups to DVD and secondary hard drives every so often so that I am not completely at risk. For items that I want to make sure I back up in more than one location (I have not hit any yet), my plan is to use ChronoSync to keep a “Backup” directory in sync. This will allow me to create a directory in one of the other storage locations labeled KeyJDBU (Key JungleDisk Backup items); I can then use ChronoSync to decide what to copy into it and keep in sync.
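ChronoSync itself is a point-and-click tool, so the sketch below is only an illustration of the same one-way mirroring idea; the source and destination paths are hypothetical stand-ins for the folders described above.

```python
# ChronoSync is a point-and-click tool, so this is only a rough sketch of the
# same one-way mirroring idea. Both paths are hypothetical stand-ins for the
# folders described above.
import filecmp
import shutil
from pathlib import Path

SOURCE = Path.home() / "Documents" / "KeyItems"   # the items I decide are "key"
MIRROR = Path.home() / "Dropbox" / "KeyJDBU"      # browsable second copy

def mirror_key_items(source, mirror):
    """One-way copy of new or changed files from source into mirror."""
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        dest = mirror / src.relative_to(source)
        dest.parent.mkdir(parents=True, exist_ok=True)
        # Copy only when the file is missing or its contents actually differ
        if not dest.exists() or not filecmp.cmp(src, dest, shallow=False):
            shutil.copy2(src, dest)

if __name__ == "__main__":
    mirror_key_items(SOURCE, MIRROR)
```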

Having the key items in iDisk or Dropbox will also allow the items to be browsable without having to restore all the data. It still does not solve another key issue: do I have access to the programs to use the data once it is restored? I found that quite often the answer was no. Most of this situation was my own fault, as I chose what format to store the data in. Once again, I could reinstall and have the data back, but that would take a while, especially with the licensing headaches some companies have put in place (that means you, Adobe). I am now considering how to handle that issue.

A few days ago I posted a Tweet that said, “signal to noise is important, not just in audio but in life”. That post was an amalgam of someone’s tweet commenting on the palaver at their job and the result of the amount of Tweets I was getting from one stream. I realize that the single stream is not an indictment of all who use Twitter (Twitterers?).

I figured I would post here what I learned from a quick study over the past week. I am following 40 streams; 32 posted something in the past week, and there were a total of 522 tweets, an average of about 16 tweets per posting stream. However, there was one person who posted 208 Tweets in one week, the vast majority of which were very repetitive and redundant. Since a picture is worth a thousand words, how much is a graph worth?

40% from one stream
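For anyone curious, the arithmetic behind that graph is simple enough to script; the figures below are the ones from this post.

```python
# The arithmetic behind the numbers above, using the figures from this post.
streams_followed = 40
streams_posting = 32
total_tweets = 522
noisiest_stream = 208

print("Average per posting stream:", round(total_tweets / streams_posting, 1))   # ~16.3
print("Noisiest stream's share:", format(noisiest_stream / total_tweets, ".0%")) # ~40%
```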

In addition, the person also put down an identifier so that they would trend, and they are getting much of their information from AlertDeck. So that person is not being followed now. The disappointing part is that they actually have something valuable to say; they have just started adding too much noise in trying to market themselves.

My warning is that marketing via Twitter can be done, but if there is no content everything gets turned off. Stay tuned… I might decide to reveal who the offender is.

Oh yeah, I have also decided that Apple’s iWork Numbers ’08 is not very powerful when it comes to collating data, as I still had to do much of it manually instead of just doing a Pivot Table in Excel. I also still can’t activate half my applications…

I have gotten a PlayStation 3; I also have a PS2, PS1, Nintendo Wii, Nintendo Game Boy, Windows, and Mac. Suffice it to say, I have various gaming environments. There have been some games that are available on multiple platforms, and I have had a chance to try a few of them. On the PS2 I enjoy playing the SSX series of games. It is an EA Sports game and fairly fun. So when I saw it available at a reasonable price for the Nintendo Wii, I figured I would try SSX Blur.

I was surprised at how different it was compared to the other versions. I realize that part of it is the change in the controller interface. The programmers, I think, were trying to use the advanced control options by having the accelerometers control the trick interface. For instance, rather than using the controller as an analog for the board alignment, one has to shake the controller in a pattern to pull a trick. However, it is not intuitive. To do a trick, one draws a heart with both controllers, one in each hand. A completely different experience than pressing Square while using the D-Pad.

There are times that the accelerometers do work wonders, most often when used as an analog for another control. For example, I downloaded a demo of a golf game for the PlayStation 3. It was abysmal compared to Wii Sports Golf, where one uses the Wii controller as a golf club. So it proves that it is not the controller that is flawed, but rather the application of the interface and technology.

I also downloaded a demonstration version of Civilization for the PS3. I really enjoy playing it on computers (both Mac and PC), so I figured it could be cool on the PS3. However, I was surprised at how different the experience was between using the computer screen and mouse and using the video game interface and screen resolution. It was just not as familiar and intuitive to me. Perhaps it was the fact that I am used to something else.

I think it just goes to prove that the interface has to be adjusted to the environment it is being used in. For a video game where one has to look at a large area, such as Civilization, having more control over the view is the key. For a sports game with a controller, the control should be analogous to the way the object on screen moves. This has not always been the case; for instance, why does pressing the “X” button make the object jump? Still, that dissociation is easier to compensate for than drawing a heart in space to make the object flip upside down.

After all that, all that I have to say is don’t assume that the experience will always be the same as the human interface changes.