Month: October 2013
The world is falling down with books, methodologies, techniques, tips and tricks on how to be more efficient in life, work, and everything else you can think of. What isn’t often discussed is how our desktops, both real and virtual, can be tailored to act as a strong foundation and take things to the next level.
I’ve been thinking about this ever since I tidied my (physical) desktop a couple of weeks ago. Wow.. what a difference. Keeping it tidy isn’t my strength, but it got me thinking about my virtual desktop too. As an Ubuntu user with multiple virtual desktops available to me, I’ve always had a strong sense of standard placement for specific applications, but after some thought I’ve taken that to the next level.
The Physical Desktop
I love my laptop, and there are times when I get in the habit of sitting in front of the fire for a few days with it on my lap, but nothing beats the productivity benefits of a desk, a second monitor, a real keyboard and mouse, kick ass sound, and somewhere to set your coffee.
I’m not really going to say much more about it than that – the pictures say 1000 words.
The Virtual Desktop(s)
Now things get interesting. If you are a Windows user then, unless things have changed since I last braved using one, you are SOL when it comes to virtual desktops. For Mac and Linux users, multiple virtual desktops are things that we’ve been using (or haven’t bothered to use) for years.
Anyhoo. Ubuntu and Mac users can have as many virtual desktops as they like. Not sure about Mac, but with Ubuntu you can configure how they are arranged, and maybe because of my old Cube days, I like 4 virtual desktops side by side, configured so that when you get to one end you wrap straight around to the other.
That means I have four full desktops that I can flick back and forward between, and each has 2 monitors.. 8 distinct areas.
So the secret to making this a haven of joy and efficiency is always keeping things in the same place relative to each other. For example, say all you use is a browser and Word Processor all day long as part of your core job, then an email client, and a music player.
You might do something like this:
Virtual Desktop 1
Laptop: Music Player
Virtual Desktop 2
Monitor: Word Processor
Why is that good? If you are working on a document and you need to send an email, you know that the email client is just over there on the virtual desktop to the right. Not a great example, but what happens when things get a bit more complex? I’m a Rails developer, and at the very least that involves:
- a browser
- a terminal running the rails server (and putting out useful information)
- an editor
- a terminal running a rails console (used constantly while writing code)
- often a MySQL GUI
As if this isn’t enough, in development one is typically working with many open files at the same time, so multiple tabs within the editor. It’s this last part that made me really rethink my old strategy and for the first time move editing and the browser to different virtual desktops.. and try something that has turned out to be incredibly valuable.
I now have 4 different editors open, with the folder tree in each open to a specific folder of a rails project.
- Models (app/models)
- Views (app/views)
- Controllers (app/controllers)
- Project Root (for other.. migrations, configs, helpers, css / js)
Each of these editors
- is not full screen
- has a corner visible no matter what (so you can get to it with a click)
- is always in the same place on the screen (I go M/V/C/Other clockwise starting at the top corner).
It’s all kinds of awesome. The result of this separation of browser and editor left some really great gaps for other things. Here’s a rundown of my 4 Virtual Desktops going left to right.
I have to say, after running with this for a couple of weeks I can’t imagine going back. My fingers have absolutely learned things like CTRL-S, CTRL-ALT-LEFT, F5 (save code, move left one desktop, refresh browser), and having the rails server beside the browser and the rails console beside the editor makes so much sense.
Give it a go!
Understanding Proxy Servers. What are they for? Why would you use one?
Proxy servers are really used for one of two things:
1) A stepping stone.
Think of a Proxy Server as a step between you and where you want to go. For example, you can configure a proxy server in most browser configurations. The result is something like this:
– you want to view google.com
– your browser sends the request to the proxy server
– the proxy server pulls up google.com
– the proxy server tunnels whatever it gets back to you.
– ANONYMITY. Google.com doesn’t see your address in the request; it just sees the Proxy Server. Of course there is no real anonymity on the web – but we are talking about what some sys admin at Google can tell about the traffic.
– SPOOFING LOCATION. If you are in the US, and there’s a website in the UK that is locked down to only allow people in the UK to access content, then by using a UK Proxy Server, that website thinks you are in the UK and you are good to go.
2) A Gateway
This is frequently more of a corporate thing, where a company comes up with various reasons why it needs a Proxy Server, but in general it’s so they can control the crap out of you. If you work for a big company then, while on the internal network, you might only be able to surf the internet through a corporate Proxy Server. There are legit and useful reasons to do things this way, but more often than not a biggie is that with all traffic tunneling through a single point it’s much easier for the IT dept to block access to certain sites or make sure you aren’t watching donkey porn at work.
For this post, we are more interested in (1), the stepping stone, and what options you have, and that really depends on what you are doing.
TYPE A: FREE (or pay once) PROXY SERVICES
If you are on vacation in Mexico and you want to use http://www.hulu.com, then the best thing to do is go to somewhere like http://hidemyass.com/proxy-list which publishes lists of Open Proxy Servers. The more recent the listing, the more likely it’s going to work.
Results will vary. A proxy might work, or it might not, and it might be horribly slow. For sure, a proxy that works today probably won’t work tomorrow. You also have no idea WHY this proxy is available, and it’s definitely a possibility that it’s sniffing whatever traffic you are putting through it, so not the best time to Instant Message your credit card details to someone.
If these services have a Premium offering, the basic message is the same – you just get the list delivered to you in a better format.
TYPE B: PAID PROXY SERVICES
There are a bunch out there, and if you are doing some sort of web scraping or web spidering or similar, where reliability and performance are important to you, there is no other way to go. PAID proxies typically ensure that they can keep up with demand by charging by throughput. That makes them a BAD idea for streaming Hulu / Netflix at your vacation property in Mexico, because media bandwidth adds up quickly.
Here are some of the things that differentiate offerings.
1) Revolving IPs
This is a good thing. Basically it means that you always call the same proxy server, but every time that proxy server makes an outbound call it cycles round-robin through a bunch of IP addresses.. so if there are 10 revolving IP addresses and you hit a page 10 times, then in theory each hit comes from a different IP.
Typically the IP addresses are sequential, so it’s not exactly rocket science for someone on the receiving end to see what is going on, but it’s still a bonus.
2) Multiple Proxy Servers
A decent service will give you credentials for a number of proxy servers, possibly in different countries. That means that you can round-robin call each of them from your application to further tangle the path between you and your final destination (See this post for an example of how you can easily do this with Ruby).
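That round-robin is a one-liner in Ruby. Here’s a minimal sketch – the proxy URLs are placeholders, so swap in the credentials your own service gives you:

```ruby
# Round-robin through a list of proxy servers. These URLs are
# placeholders -- substitute the ones your paid service provides.
PROXIES = [
  "http://us.example-proxy.com:31280",
  "http://uk.example-proxy.com:31280",
  "http://de.example-proxy.com:31280",
].freeze

# Each call returns the next proxy, wrapping around at the end.
def next_proxy
  @proxy_cycle ||= PROXIES.cycle
  @proxy_cycle.next
end

4.times { puts next_proxy }  # the 4th call wraps back to the first proxy
```

Each outbound request then grabs next_proxy, so consecutive hits leave through different servers.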
3) Short Term IP use
Not only do decent services round-robin between different IP addresses, they often throw those IP addresses away periodically and start with new ones. The advantage here is that if someone sees incoming traffic and blocks an IP, then that block is only good until the Proxy Service throws that IP away.
A really good PAID proxy service: ProxyMesh.com
It’s been about 6 months since I dug in, but I really like ProxyMesh.com
– Their prices are reasonable starting at $1 / gig.
– They have highly maintained US and UK proxies with revolving Short Term IPs
– They also manage a list of Open Proxy servers (like the ones listed at hidemyass.com) but THEY manage the list on their end. You just call the same ProxyMesh proxy, and they farm out the request to any one of hundreds of proxy servers – and that list is very fluid as proxy servers come and go worldwide. These proxy servers are of course much less reliable than their core service, but offer significantly more anonymity.
That’s all folks.
Here’s the problem in a nutshell. People like me who drooled over XBMC for years did so because it did an incredible job of dealing with local media files. We then got really excited because XBMC did an adequate job of dealing with streaming services that started to creep into our lives, like Hulu, Spotify, Amazon Video etc.
That was a while ago – but here we are now, and the world has changed. If you are anything like me, you’ve gone through this evolution:
1) EVERYTHING was local media, acquired by whatever means necessary.
2) Streaming services crept into my life, but still – 90% local, 10% streaming
3) Today.. > 50% streaming, < 50% local.
That’s BAD for XBMC.
It’s not XBMC’s fault.. it’s a great product, created and maintained by a great bunch of open source guys. The real problem is the plugins. Hulu couldn’t give two shizzles about the XBMC market.. likewise for all the rest. They just don’t care.. and that means that the plugin creators have an uphill battle.
It’s one thing to be a part of that battle when the plugins account for 10% of viewing pleasure, but when it’s more than 50% (for me, 90%!) it gets really painful. To put that a different way: 3 years ago, XBMC did 90% of what I wanted really well – and I had to fight with plugins for the other 10%. Now it does 10% well – and I have to fight with plugins for the 90%.
MAKES NO SENSE ANYMORE!
For me, the nail in the XBMC coffin was a little bundle of joy called Plex. Plex server in its current incarnation is all kinds of sexy. It runs on pretty much anything (and for me that’s the only remaining Linux desktop in the house) and merrily does things very very well. In a way it’s a media center without the “center”. It scans specific folders for specific media, works out what it is, adds cover art, and then, joy of all joys, makes that content available through a browser.. and.. to Plex clients.
What Plex clients are there, I hear you ask? Well that’s where things get really nice..
– Others (including iProducts)
I’m just going to leave the “others” up to you, because as far as I’m concerned that’s already mindblowing.
Let’s switch gears for a sec and talk about Roku. Roku, IMHO, is the Apple TV for non-fanboys. I didn’t realize how astonishingly awesome it was until the Plex server features made me stop and say, “Hey.. wait a minute”, and take off my XBMC tunnel vision goggles.
Roku Rocks. The Roku 3 is a powerful little beast in the palm of your hand, with a great interface and a nifty little idea – headphone socket on the remote. Love it. But can it really make all of these big PCs go away that I have scattered around the house connected to TVs?
Hell yes.. because of Plex. Thanks to the Plex plugin for Roku, that shrinking yet vital media set that one acquires through various means (because it’s not yet on Hulu, Amazon or Netflix) is still right there in a beautiful package, as is your entire music and photo collection.
Nuts and bolts.. here’s the choice:
XBMC:
– great support for local media
– buggy support for Hulu
– buggy support for Amazon
– ugly support for Spotify
– buggy smartphone remotes
– various fun things
– great people out there working hard to provide plugins in an uphill battle against service providers e.g. PBS
Roku 3 + Plex:
– beautiful support for local media through Plex
– flawless support for Hulu
– flawless support for Amazon
– flawless support for Netflix
– flawless support for Spotify
– excellent Roku smartphone app
– adequate streaming from smartphone apps
– service providers falling over themselves trying to work with Roku to add channels e.g. PBS.
I’m sad to say that for me XBMC is dead. I’ve gone from two hefty XBMC PCs to two Roku 3s, picked up for less than the price of the video cards that were running XBMC, and I couldn’t be happier.
At home I have various devices doing various things.. and it’s important to me that I know they are working. There are many tools out there designed to keep an eye on server health, but they don’t do everything that I want in the way that I want, and I’m a big believer that coding = creating, so there’s nothing wrong with reinventing the wheel just for the sake of it.
For me it started recently when I repurposed an old Android phone as an IP webcam pointing out the front window. After that it was a natural progression to install the excellent Linux app Motion to pick up movement from that feed, and store images and mpegs to disk – for security. Well, no point in storing the images on the nice machine that is likely to be stolen in the event of a break-in.. so let’s store them on an old EEE-Box that’s running headless in a hidden corner of the basement 😉
Well there you have it. Definite needs for a frequent health check:
– is the camera working?
– is the EEE nfs mount mounted?
I wrote a ruby script and a cron job to solve that problem, but then it just grew. Now I have the system running on multiple machines:
– checking each other
– making sure all my websites are responding to ping
– validating that certain URLs are giving a 200 response
– making sure that disks aren’t filling up
– checking that certain processes / background applications are running
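A sketch of what a couple of those checks can look like in Ruby – the URLs, threshold, and messages here are illustrative, not my real config:

```ruby
require "net/http"
require "uri"

# True if the host answers a single ping (1 packet, 2 second timeout).
def ping_ok?(host)
  system("ping -c 1 -W 2 #{host} > /dev/null 2>&1")
end

# True if the URL responds with HTTP 200.
def url_ok?(url)
  Net::HTTP.get_response(URI(url)).code == "200"
rescue StandardError
  false
end

# True if root filesystem usage is below the threshold (GNU df).
def disk_ok?(threshold_percent = 90)
  `df / --output=pcent | tail -1`.to_i < threshold_percent
end

# A machine's checklist is then just a list of these calls.
problems = []
problems << "disk filling up" unless disk_ok?
puts problems.empty? ? "all good" : problems.join(", ")
```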
On top of that:
– my laptop knows if I’m at home or away and tests accordingly
– the system creates Desktop icons for each problem (and removes them when problems resolve)
– it can generate Ubuntu desktop notifications
– it can notify me of issues with a text message
– each machine manages hostname|issue style files on Dropbox, so wherever I am my laptop can tell me what’s going on at home.
– it shuts up at night
– for every problem it finds, it checks to see if there are instructions to try to correct it
All this is achieved with:
– 200 lines of core ruby methods that perform all the tests
– 50 lines of code relating to specific issue resolution
– 100 lines of control code – one liners which are basically “If you are this machine, then do this test”
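Those control one-liners need nothing fancy. A sketch, with made-up hostnames and checks standing in for the real ones:

```ruby
require "socket"

# Prints and returns the result of one named test.
def check(name)
  ok = yield
  puts "#{name}: #{ok ? 'ok' : 'PROBLEM'}"
  ok
end

# "If you are this machine, then do this test."
case Socket.gethostname
when "laptop"  then check("eee nfs mounted") { system("mountpoint -q /mnt/eee") }
when "eee-box" then check("motion running")  { system("pgrep -x motion > /dev/null") }
else                check("basic sanity")    { true }
end
```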
In Part II we’ll take the very basics of that code, and create a simple single ruby script to keep an eye on a machine’s own health.
Dropbox and Google Drive (with Insync) – let’s just call them “Netdrives” – are great ways to sync files across multiple linux (Ubuntu) machines.
- Utterly reliable sync
- Online login to get your files from anywhere
- Great mobile support
- Can be used headless on linux servers
- Cross platform (Mac, PC, Linux)
When it comes to using them as a repo for scripts, though, they have inherent issues. They don’t retain file attributes.. so you create a file on one machine, chmod +x it, and it gets synced to all other machines – but without the +x.
So what are the alternatives?
- Ubuntu One.. but every time I try it I walk away thinking it’s not really an option at all. Why can’t Ubuntu get this right?!
- NFS mounts. A good option if everything’s on the same LAN, but what if it’s not?
- sshfs mounts. I still love sshfs mounts for certain things, but not when I want persistency.
- Github.. a bit clunky for this.
So, here’s what I do to solve this issue. A BETTER way (and I’ll blog that soon) is to use incrond (an inotify-based cron), but here’s a quick and easy way that works well and is really painless.
1) Put all the files in the same place.
I use /Dropbox/config/bin for Bash files
2) Create two files that do the “work” for you.
You need two files, one of which you actually have to chmod +x yourself at some point – but it’s not a file that will EVER change.
I use /Dropbox/config/bin/worker as a file that has the job of making sure all other files in this tree are executable
chmod -R +x $HOME/Dropbox/config/bin
Note that it launches the file worker2
Here’s MY worker2
It just launches something that I want to run often. The reason for the worker / worker2 split is that you can modify worker2 from any machine, and worker will make it executable just before executing it.
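My actual scripts aren’t reproduced here, so here’s a hedged sketch of the worker idea. The BIN variable and the demo fallback are additions so the sketch runs anywhere; in real use worker2 just calls whatever you want run every minute (pulse_check, in my case):

```shell
#!/bin/bash
# worker -- the one file you chmod +x by hand, and a file that
# never changes. It restores the executable bits the sync dropped,
# then hands control to worker2.
BIN="${BIN:-$HOME/Dropbox/config/bin}"

# Demo fallback (not part of the real setup): if the Dropbox tree
# is absent, build a throwaway worker2 so the sketch still runs.
if [ ! -d "$BIN" ]; then
  BIN="$(mktemp -d)"
  printf '#!/bin/bash\necho "worker2 ran"\n' > "$BIN/worker2"
fi

chmod -R +x "$BIN"   # re-mark everything in the synced tree executable
"$BIN/worker2"       # worker2 is now runnable, whoever last edited it
```

worker2 itself is just another file in the tree, so you can edit it anywhere and worker makes it executable again a moment before running it.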
3) Use Cron to run worker as often as you like.
* * * * * export DISPLAY=:0.0 && export XAUTHORITY=/home/keith/.Xauthority && sudo -u keith /home/keith/Dropbox/config/bin/worker
I run it every minute.. because I want that pulse_check (which makes sure all my stuff is healthy) to run every minute anyway. Note the overly complex way this is called in cron. All that XAUTHORITY stuff on an Ubuntu box means that notify-send (which creates desktop notifications) can be used by anything downstream launched by worker, and the notifications show up on the desktop.
That’s it! It’s not rocket science, and in a way it’s a bit ghetto – but it works well.
I have no idea how often people ACTUALLY comb their logs looking for someone scraping their pages, but sometimes you just want to fly under the radar. I generally don’t agree with stealing web content by scraping, but I do believe that if someone is in the data distribution business and they suck at it, it’s OK to bend the rules a little. For example, if they offer an RSS feed that is buggy, slow, huge etc., but their homepage offers the same information more reliably – go for it.
There are basically two things at play here:
- Spoofing a user-agent (pretending to be a browser not a script)
- Spoofing the source of the request.
Here’s a little function you can call to get a random user agent, based on a list of really common user agents.. Thanks to the guy who posted this to a blog, sorry, I don’t have the reference anymore.
I usually put such things in a model utility.rb so I can just call it with Utility.random_desktop_agent
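Something along these lines – the agent strings below are a small illustrative sample of common desktop browsers, not the original list:

```ruby
# app/models/utility.rb
class Utility
  # A small sample of common desktop user-agent strings.
  DESKTOP_AGENTS = [
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.101 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.101 Safari/537.36",
    "Mozilla/5.0 (Windows NT 6.1; rv:24.0) Gecko/20100101 Firefox/24.0",
    "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:24.0) Gecko/20100101 Firefox/24.0",
  ].freeze

  # Pick one at random per request so traffic doesn't all carry
  # the same fingerprint.
  def self.random_desktop_agent
    DESKTOP_AGENTS.sample
  end
end

puts Utility.random_desktop_agent
```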
That takes care of the user agent; now on to the proxy. You’re going to go out and get your own list of proxies.. whether you find some reliable free ones or pay for a service. With the best services you call a single proxy of theirs, and it cycles through a bunch of IP addresses, each call going out through a different one, round robin. They then dump those IPs every 30 minutes or so.. not bad.
Again – I put that in the Utility.rb model.
So let’s put it all together. Calling a page looks something like this with open-uri:
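A sketch of the call – the proxy URL is a placeholder for one from your own list, and note that on current Rubies open-uri is invoked as URI.open (back then plain open worked too):

```ruby
require "open-uri"

# Fetch a page through a proxy with a spoofed user agent.
def fetch(url, proxy:, agent:)
  URI.open(url, "User-Agent" => agent, :proxy => proxy, &:read)
end

# Example call (placeholder proxy; Utility.random_desktop_agent is
# the helper from the Utility model above):
# page = fetch("http://example.com",
#              proxy: "http://us.example-proxy.com:31280",
#              agent: Utility.random_desktop_agent)
```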
That’s about it.