Archive for December, 2008
I’ve been refreshing my skills on the lightweight Java stack (i.e. Spring/Hibernate/JUnit/JMock/AspectJ) recently, which has been really fun.
It’s been a while since I built something significant in Java. 95% of the work I’ve been doing at Highwinds has been straight Flex, and during my management stint at CFI before I came to Highwinds I spent most of my time directing and architecting development efforts as opposed to getting my hands dirty. As a result, my last coding project with Spring ended just as 2.0 was coming out, which meant I missed out on all the bad-ass annotation support and could only look over developers’ shoulders to see it in action.
Of course, nothing has changed in principle – the implementation just got simpler and better. I’ve spent the last week picking up the “how” of the Spring aspect-oriented support for… well, everything, and learning JMock.
I needed a quick reference to get moving, so to supplement the metric assload of online materials I picked up a copy of Spring Recipes by Gary Mak. It’s an awesome quick reference/cookbook and was exactly what I needed to get going. After all, I knew what the framework could do; I just needed something to boil down the config for me so I could put the principles to use.
So, here I am a week later with an almost-finished Java service app. Of all the features, I think the JMock and JUnit stuff is my favorite. There’s nothing cooler than being able to test your stack on the way down as you code it without having even finished the components at the lower layers – love it. Annotation-based validation in OVal comes in second, and Hibernate config-by-domain-object-annotation comes in third.
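The appeal of testing your stack on the way down is easy to sketch even without the framework. The snippet below is a hand-rolled stand-in for what JMock does for you (JMock’s real API uses a Mockery and expectation blocks, which I’m not reproducing here), and every class and method name is hypothetical, invented purely for illustration: the upper-layer service gets tested against a canned implementation of a lower-layer DAO that doesn’t really exist yet.

```java
// All names here are hypothetical -- nothing from a real project.
interface PriceDao {                       // the lower-layer component, not yet implemented for real
    double priceFor(String sku);
}

class PricingService {                     // the upper-layer component under test
    private final PriceDao dao;
    PricingService(PriceDao dao) { this.dao = dao; }
    double totalFor(String sku, int qty) { return dao.priceFor(sku) * qty; }
}

public class MockDemo {
    // A hand-rolled stand-in for what a mocking library generates:
    // a canned implementation of the lower layer, so the upper layer
    // can be exercised before the real DAO is written.
    static PriceDao stubDao() {
        return new PriceDao() {
            public double priceFor(String sku) { return 2.50; }  // fixed canned price
        };
    }

    public static double demo() {
        PricingService service = new PricingService(stubDao());
        return service.totalFor("SKU-1", 4);
    }

    public static void main(String[] args) {
        System.out.println("total = " + demo()); // prints total = 10.0
    }
}
```

A real JMock test would additionally let you assert that `priceFor` was called exactly once with the expected SKU, which is the part a hand-rolled stub makes tedious.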
LeGros was also showing me Grails this week, so that might be what I turn my attention to next…
As Brian was wrangling the AIRRunner into shape to support our continuous integration environment, he ended up cleaning up the code for the AIR support significantly. The result is a test runner that produces Surefire-compatible output and counts errors properly, plus a bunch of other stuff that Brian tallies up in his post. Turns out the Fluint team liked Brian’s changes enough to make him a part of the project.
Now, if Adobe would just hurry up and find a way to let Flash and AIR apps run headlessly, and if I could finish my bytecode weaving library for ActionScript 3 so we can have proper mocks, we might approach real testing capabilities for Flash/Flex in the next year or so.
Don’t hold your breath… :)
I believe that when naming classes and interfaces in a system, they should be named after their purpose – not their implementation. This is purely a stylistic choice, but I’m always interested to see who does what. I’ve seen both practices employed, and while neither is “better”, I have my preference.
When I started programming, I used to name things “Abstractxxx” and “Ixxx” to indicate that they were abstract classes or interfaces. I quickly found that this was redundant and could lead to confusion. It’s redundant because languages that support abstract classes and interfaces have keywords that describe them, and most can generate documentation that logically separates interfaces from classes. It leads to confusion because if you later refactor an object (such as extracting an interface or making something abstract concrete), you have to change not only the implementation, but the name as well. People familiar with the codebase then have to relearn the new names after the refactoring.
Of course, modern IDEs handle refactoring very gracefully (and might even find all your name references in non-code resources such as config files), but that’s not really the point. The point is: choose your implementation and your names separately. That way, you can refactor whenever you feel like it, and the purpose of the object is captured in its name without any unnecessary hints about the implementation.
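A minimal sketch of the idea, with every name invented for illustration: the interface is named for its purpose, the implementation is free to mention its mechanics, and callers only ever see the purpose-named type.

```java
// Named for its purpose: it stores users. No "I" prefix needed --
// the interface keyword already says what it is.
interface UserStore {
    void save(String username);
    boolean exists(String username);
}

// The implementation can mention the "how", since that's what varies:
class InMemoryUserStore implements UserStore {
    private final java.util.Set<String> users = new java.util.HashSet<String>();
    public void save(String username) { users.add(username); }
    public boolean exists(String username) { return users.contains(username); }
}

public class NamingDemo {
    // Callers depend only on the purpose-named type, so swapping the
    // implementation (or extracting/merging interfaces later) never
    // renames anything the rest of the codebase sees.
    static boolean registerAndCheck(UserStore store, String name) {
        store.save(name);
        return store.exists(name);
    }

    public static void main(String[] args) {
        System.out.println(registerAndCheck(new InMemoryUserStore(), "alice")); // prints true
    }
}
```

If `InMemoryUserStore` is later replaced by a Hibernate-backed version, only the constructor call changes; the name `UserStore` stays put.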
I got a new toy recently. To paraphrase Lester Burnham from American Beauty: a 2006 Access Virus TI Polar. The synth I’ve always wanted and now I have it. I RULE! :)
Some People Who Know What They’re Doing
If you can’t be bothered to read my audio-nerd review, here are a few links to videos of the synth in action, plus editorial reviews that you might find interesting if you are thinking about buying a Virus TI. Feel free to post questions here as well and I will do my best to answer them as my level of expertise matures.
The Atomizer demos are my favorite. It’s like instant BT-stutter effect.
- Marc Schlaile demoing Atomizer (the best Atomizer demo I found so far)
- Rich Devine demoing the Atomizer plugin
- Nutty D&B Atomizer-plus-old skool-Roland TR-606 demo
- In-depth demo of Virus full-size keyboard with Ben Crossland from Access at NAMM 2005 (choose your player from the links at the top)
- People dicking around with a Virus TI Polar at Remix Hotel 2008 in Miami, FL
A Little History
Back in 1997, Access came out with the Virus A desktop, and some years later the very cool Indigo series of keyboards. They were renowned for their awesome synth engines and solid build quality, but like most European synths they were too pricey for the likes of me.
Then they went and upped the ante with the TI series, which stands for “Total Integration.” Since Access manufactured both PCI audio hardware acceleration cards paired with software synths for PC musicians (the Powercore series) and hardware synths, they decided that they might as well combine the two and make keyboard synths that integrate directly into software. This was a brilliant move, since anybody who has messed with getting hardware synths to play with computer sequencing packages knows how much of a pain it can be to deal with MIDI and latency issues. Of course, these models weren’t any cheaper than their pure-hardware predecessors. It seemed like I would never own one; I more or less forgot about them.
Then last weekend, after bouncing around some old music sites, I came across some articles on the TI series and decided to jump on eBay to see if any were on the market. As it turned out, there were two nearly-new Virus keyboards for sale at pretty reasonable markdowns. I decided to put a bid on one of them, and my top limit matched the reserve. I hit up the seller directly to see if he wanted to wrap up the sale that weekend, and within 15 minutes of making my bid we were on the phone. He agreed to drop his “buy it now” price to the reserve, and it showed up on my doorstep yesterday afternoon.
This is to date the most expensive piece of audio kit I have purchased, which is surprising in the sense that I never even fiddled with one in the store (they are pretty hard to find). Most of my prior purchases have involved several trips to a local music shop and much obsessing before pulling the trigger. However, things have changed a lot in the years since I originally loaded up my studio, and with so many resources at my disposal on YouTube, SonicState, and musicians forums, I got a great feel for what I was getting into. But even with so much info at my fingertips prior to the acquisition, I’ve been blown away by this thing after only a few hours of use.
I’ve only had my Virus a little while, so there’s only so much I can report on, but I intend to build upon this over time as I work my way through the manuals.
While you need a computer to take advantage of the full TI capabilities, you can also just turn the thing on, plug in a pair of headphones, and get cracking without hooking it up to a computer. As a standalone hardware synth, the quality of the audio engine is above and beyond any hardware or software synthesizers I have ever seen before, including the Nord Lead and Alesis Andromeda that I had coveted in the past. The filters on the Virus are amazingly rich, and the combination of access to almost every parameter from the control panel combined with a very intuitive menu system allows for a massive variety of sounds.
Out of the box, you get over 2,000 high-quality sounds distributed across a number of ROM and RAM banks. Within the first few patches I was starting to get ideas for tracks, since the quality is very high and the easy tweakability gives you a lot of creative expression. I also love the three-octave keyboard, which is just the right size for composition at the computer. It’s the perfect width and height to sit in front of my 22″ Dell monitor.
I spent most of the afternoon today getting familiar with the software integration. The Virus hooks up to a PC or Mac with a USB cable, after which integration is pretty much seamless. One of the benefits of the TI series is the flashable OS, so a quick trip to Access’s support section led me to the latest 2.7.5 release, along with related manuals and video tutorials. The software installed without a hitch, the longest part being the update of the keyboard itself, but after a quick reboot of both my new MacBook Pro and the Virus, I was ready to rock and roll.
Access provides a number of high-quality manuals to get you going, all in PDF format, which worked out nicely since I now have a dual-monitor setup at the house. There were also sample projects for popular sequencer packages; I’m using Logic Express. I’ll throw in a quick shout-out for Logic Express here: it’s basically the pro version that studio musicians use, with a cap on the number of audio tracks – and the cap is something ridiculous like 192 audio tracks. After spending around $900 six years or so back on Cubase SX and a sampler package from Steinberg, I can’t believe how powerful Logic Express is for a mere $300, especially considering that the software synths and samplers in the basic bundle are sufficient for anybody to get far off the ground with music production. Although I liked Cubase a lot, I had to upgrade to get Leopard support, and decided to switch over to Logic Express rather than sticking with Steinberg’s product line; I must say that I am glad I did. I originally made the switch with Logic Express 7, but Apple completely revamped the interface for Logic 8 and it’s now much more streamlined and intuitive than before.
Back to the Virus. Once you fire up your sequencer, the Virus shows up as an Audio Unit software instrument in addition to seamlessly infiltrating itself as the MIDI controller and audio interface. This was one of the fringe benefits of getting the TI: rather than spending several hundred on a new MIDI interface and then dicking around with setting it up, I get a MIDI interface right in the TI, along with audio inputs/outputs for recording and playback. This is great because I can use drum loops and other bits from Logic right alongside sound from the Virus, and hear it all mixed down through the Virus’s headphone jack.
True to form for a German synth, the Virus is built like a tank, and has an excellent quality keyboard with velocity sensitivity and nice clicky aftertouch. The thing lights up like a Christmas tree when you turn it on, and has a number of lights that pulsate to the BPM setting, including a guide light above the LCD display (for usefulness) and an Access logo on the back (for showing off). Note that because of the software integration, the Virus is always at the same BPM as your sequencer, which is totally fabulous: your arpeggiators are always in sync no matter where you are! The pots have a smooth action, and the control surface layout is very logical. I was able to get going with basic editing of sounds without having to crack the manual, which is always nice.
The software integration is really pretty top-notch. Once you fire up Logic, you create the Virus in the Environment as a software synth with 16 MIDI channels. The Virus then shows up in the Mixer as a stereo audio channel, and each MIDI channel for the synth is split across the 16 available slots. The Virus hardware automatically goes into “Sequencer” mode once you have it established in the sequencer, and then you can start the bi-directional manipulation through the software/hardware feedback loop.
I have been really impressed with the interface on the software. Not only is it attractive, but it’s very intuitive. You can drag and poke just about every control, including grabbing ADSR envelope graphs and filter slopes and moving them around. All the feedback is instantaneous regardless of which direction you are going in, meaning that the synth and software stay in total sync as you work.
One thing I noticed is that latency is still a minor issue, although there is a good way of working around it. When you are recording, you can set the Virus up to send the audio directly out of the headphone jack, so you don’t suffer from the latency of the audio going from the Virus to Logic and back again. Then, once you have finished recording, you can flip the audio interface back so that it is routed through USB and the latency is handled in the software as usual. The only scenario where this is a little weird is when you are recording multiple MIDI tracks one after the other, since the tracks recorded earlier will be affected as you flip the audio mode back and forth between real-time and USB-routed. A simple solution is to mute earlier tracks as you record later ones; alternatively, you can bounce the audio for prior tracks to disk before you record new ones, removing latency for earlier tracks altogether.
One thing I did notice that required some getting used to was that MIDI muting and soloing happens in the TI plugin, while audio muting and processing happens in a single channel in Logic. This is because the Virus has 16 MIDI channels but only two stereo audio channels, which means you have to choose between the two audio channels when you are routing audio in to the mixer. I don’t see this being a problem in practice since I see myself composing via MIDI and then bouncing all the audio to disk before mixing down, at which time each track will get a dedicated strip in the mixer and can have its own effects in Logic’s audio channels as normal. However, if you were performing live with the Virus, you would have to make sure that you either used the on-board effects for each patch, or made careful use of the two channel strips in Logic for adding software effects.
Conclusion (for now…)
It’s early days yet, but I am really excited with the possibilities my new Virus is bringing to my home studio.
My interest in digital audio and composition started in my teens. After finally getting a real job and buying a bunch of audio equipment in 2002, my interest waxed and waned, since it used to be such a pain in the ass to get all the gear working together (technical problems really stifle creativity), and as a result I had put this hobby on the back-burner for a while. My interest recently resurged when I started playing with Logic 8’s new interface, and between the Virus making hardware integration so seamless and its awesome synth engine, I figured I would give music another spin.
While I lack the talent to be a professional musician, I am looking forward to writing sound libraries, remixing tracks from my favorite artists, and maybe even composing a track or two in my spare time. If I come up with anything half-decent I might even share it with the world… :)
If you are in the streaming media industry, you will definitely have heard of Streaming Media magazine and their EVP, Dan Rayburn. Recently, Dan blogged about the StrikeTracker console developed by my team at Highwinds. Dan covers a little history in his post so I won’t repeat it all here, but it’s great to see some of the things that he has written, since they are spot-on with regard to our approach.
At Highwinds, we have a strong focus on user experience and meeting end user needs, as evidenced by the fact that most of our CDN services and reporting can be provisioned and accessed in real-time by the customer without needing to engage the NOC. So, when Dan released the results of his survey of CDN customer needs earlier this year, we immediately incorporated it into our product roadmap as a CDN customer “wish list”. One of the key areas we have been focusing on is leveraging the real-time nature of our CDN platform to deliver actionable analytics to the end user as quickly as possible – usually within 30 seconds of the data being captured. This is obviously of great benefit to customers who are delivering marketing-driven content and/or working in a live environment, where the timeliness of the information produced by the CDN is directly related to its usefulness.
To date, our focus on real-time analytics and empowering the end user through the StrikeTracker console is a trend that has won us a lot of favor with our customer base, and has brought over key customers from other CDN players. We certainly have plans to continue to build upon this strategy in 2009, although I obviously can’t share the specifics without spoiling the surprise (not to mention tipping off the competition)! Let’s just say that if we are successful, 2009 is going to be yet another strong year for Highwinds.
While I’m on the topic, Limelight also made a blog post on analytics recently that I found interesting. The following paragraph was of most interest. (Before I comment on this, I’ll remind everybody that this blog carries the opinions of me alone and does not reflect the thoughts of my employer.)
“At Limelight Networks, [we] currently accumulate and process over 100 terabytes of uncompressed log files each day due to the sheer volume of Internet traffic we deliver globally on our network. We offer byte-level accurate reporting on this traffic – not a sample or estimate, but an actual accounting of each bit we deliver. Reporting and analyzing this data in a timely and consistent basis is no small task when you are delivering massive traffic volumes. This puts us in a unique position to not only innovate, but also provide analytics and insights that few, if any, companies have ever delivered to their customers.”
The blog post is clearly a generalization about CDNs and how their technology scales, which I find largely accurate based upon the industry players that I am familiar with. As Limelight points out, working at the scale of a CDN is extremely challenging, and it makes sense that this would be the case: the reason that the CDN industry exists is that we agreed to take on problems with Internet content delivery that nobody else wanted to, precisely because they are massively hard problems to solve.
What is interesting to me is that in solving these problems, many CDNs have taken something of a “cookie-cutter” approach, using the same (or very similar) network architecture and patterns as those established when the industry was in its infancy. Indeed, this lack of technology differentiation is a major reason why there is so much litigation currently taking place with regard to patent infringement in the CDN space.
The reason why Highwinds is successful is our core architecture, which is significantly different from every other CDN on the planet. For example, Limelight points out in their post that they accumulate and process over 100 TB of log files every day to deliver byte-accurate analytics. I believe that this accumulate-and-process approach is the same one that 98% of the CDNs out there would take.
What’s interesting is that the accumulate-and-process approach is the direct opposite of the way we deal with reporting data on the Highwinds CDN. When you deal with analytics by waiting to accumulate it en masse and then processing it after the fact, you give up valuable time waiting for the accumulation and processing cycles to end. This lag in time doesn’t make for a real-time solution.
So, we do something else entirely to process our data, and we can still deliver byte-accurate accounting and analytics. But unlike the other CDNs, we can do it in real-time, and that means you get the information you need when you need it – such as during your live event instead of after it is over. The nicest part of our solution is that it scales directly with the size of our network, so it will always be there as a feature regardless of how much content we are delivering.
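To be clear, I can’t describe our actual pipeline here, so the following is a deliberately toy sketch of the general contrast, not a description of the Highwinds platform: instead of accumulating logs and processing them after the fact, each delivery event updates a running, byte-accurate total the instant it is recorded, so the aggregate is always readable with no processing cycle to wait out.

```java
import java.util.HashMap;
import java.util.Map;

// Toy streaming-aggregation sketch (all names invented). Contrast with
// the accumulate-then-process model: here the aggregate is maintained
// in-line with each event, so reporting is available immediately.
public class RealtimeTotals {
    private final Map<String, Long> bytesByCustomer = new HashMap<String, Long>();

    // Called once per delivery event. The running total is updated
    // at record time -- there is no separate "accumulate" phase.
    public void record(String customer, long bytes) {
        Long current = bytesByCustomer.get(customer);
        bytesByCustomer.put(customer, (current == null ? 0L : current) + bytes);
    }

    // Byte-accurate and readable the moment the last event landed.
    public long totalFor(String customer) {
        Long total = bytesByCustomer.get(customer);
        return total == null ? 0L : total;
    }

    public static void main(String[] args) {
        RealtimeTotals totals = new RealtimeTotals();
        totals.record("acme", 1000);
        totals.record("acme", 2500);
        System.out.println(totals.totalFor("acme")); // prints 3500
    }
}
```

A real system obviously has to make this concurrent and distributed across the network, which is where the hard engineering lives; the point of the sketch is only that the per-event update model eliminates the batch lag entirely.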
Of course, our differentiated approach doesn’t make dealing with the engineering problems of content delivery any easier. Every time we consider a feature for incorporation into our CDN, we have to be able to answer the questions: “How do we keep this real-time?” and “How do we make this scale?” Answering these questions effectively without compromising on feature usefulness is certainly very hard. However, our CDN has the huge advantage of being built on a proven and scalable real-time platform to begin with. In many cases, our architecture answers these questions for us, which is a huge plus when you are working to deliver new and exciting real-time features to your end users that other CDNs can only dream about.
In closing, I’m really excited about 2009. We’re taking a step back from StrikeTracker and re-evaluating every feature to determine what we have done well and what we can improve upon, and we have a mountain of constructive customer feedback, both positive and negative, to use as our guide. We’ve also got some awesome new features in mind that I believe will redefine what users will come to expect from a CDN administration console. I look forward to demoing them to you in person at an industry event in 2009.
I got a new MacBook Pro 15″ at the end of November, and just got around to setting it up properly and putting it to use.
Today, I plugged in my SanDisk SDAD-109 ExpressCard reader to transfer some video and pictures from my HD Sanyo camera/camcorder. Halfway through the transfer, I got the “Device Removal” error that you get when you remove a card without unmounting it, even though I had not touched the card. A second later the card remounted. It then proceeded to randomly mount and unmount as it saw fit.
In trolling the forums, I saw that I was not the only person with this issue. Apparently the problem is intermittent, since some people had it and others didn’t. It was not restricted to one particular type of ExpressCard reader – it seems to have affected people with both SanDisk and Griffin Technology models (and possibly others). One guy said he sent his Mac back for a new one and that the replacement didn’t exhibit the same issue.
Anyway… I was able to fix the issue by simply calling AppleCare, at which point they recommended I reset the PMU (which they walked me through). I listed the steps for how to do this in detail on the support forums, so feel free to link over there for the resolution. I can’t find a way to link directly to my comment, so just scroll down to my post at Dec 13, 2008 11:15 AM after you link over (it’s listed under my name).
Something I struggled with for years after I became a manager was how to deal with the metric assload of email I would receive on a daily basis. Over time, I have developed a strategy for dealing with email that works as well for 200 emails per day as it does for 10. Since I know more than a few people who struggle with their email volume, I thought I would share my process. I apologize for the length of this post, but I wanted to explain the philosophy behind the system as well as the system itself.
A full (and constantly growing) inbox is a major source of stress for most managers working in an office environment. The psychological effect of seeing your email grow each day is the same one you would get watching a physical inbox fill with items requiring your attention; you always feel “behind”, unproductive, and like there is no end in sight. There is also a negative impact on individual effectiveness for both the email hoarder and their staff, since so much communication takes place in email these days; you can easily end up missing or delaying responses to important items if you are always behind on reading your email.
Much of what I practice can be gleaned from the basic principles of GTD. I bought the GTD book for $6 in a CompUSA store that was going out of business, and stopped reading it after the first few chapters since I had picked up enough useful info to establish a working system.
So, here it is.
My email/task management system is based on the following principles.
1) Email is for communication, information, and CYA. It is not for task or to-do tracking.
A lot of people fall into this trap, using email as a way to indicate that something still needs to be done. This often takes the form of setting emails back to “unread” status if they need to be acted on or referred to later. Other methods include using subfolder structures or color/project tagging to organize email. Unfortunately, since email is a communication tool and not a task management tool, these techniques don’t really work.
We’ll come back to how to deal with email effectively, but the essence to take away at this point is that you should not use email in any way to track tasks or to-dos.
2) All emails fall into three categories: things to do, things to delegate, and things to delete/file for later reference
There’s not much to expand on here. Take a look at your inbox and see if you can find anything that doesn’t fall into one of these categories; if so, let me know in the comments.
In order to keep your inbox empty (and thus eliminate the “full inbox stress” you are feeling), you need to quickly go through your emails, identify each message’s category, and act upon each accordingly.
3) Your inbox should always be empty, except for when you are filtering emails into one of the categories mentioned in item #2
If you think this is an impossible feat, think again. I went from having around 500 unread emails at any one time to my current state, which is a near-constant state of inbox emptiness and a feeling of total control over my email. I don’t care how much email you get or how infrequently you look at it; you can use the techniques I describe to get it under control and keep it that way. I used to get 200+ emails a day at CFI, and was able to handle them with ease.
And yes, this does mean that when you start to implement my approach to email management, you will have to burn four hours on a Saturday cleaning out your inbox in preparation for keeping it empty moving forward.
Believe me: it’s worth doing.
The system works as follows. I expect there are variations of this approach that you can use for your own email management based upon the mail client at your disposal, but this is how I do it.
1) Tasks Belong in Calendars
An essential part of the process is to use a calendaring system to capture deferred action rather than using your email to track things to do. Every email that results in a “do” item for you should immediately be converted into either a task or an appointment on your calendar.
Note that it is just as important to schedule appointments for yourself to do things as it is to schedule meetings with others, and I have found this to be an important step in managing my own time. The same way that you should always pay yourself first by putting some of your paycheck in to savings before you pay your bills, you should also mark your time off first before offering it up to others for their meetings.
2) Always empty your inbox when mail arrives
As soon as you see that you have email, you need to deal with it and put it into the “do”, “delegate”, or “delete” category. Usually I find that I stay focused on a work task for a period of time, and then look up when taking a break to see that I have new email.
One of the great GTD principles I picked up is to immediately perform any task that can be done in less than five minutes; emptying your inbox falls into this category. Emptying your inbox is easy to do: go through the emails one by one, and read each in its entirety. If you need to reply, do it immediately. As your inbox empties, you will find that you get used to sending short, fast responses 90% of the time.
3) Do small tasks right away
If you come across an email that is a “do” item and it will take less than five minutes and you have the time, do it then and there. If the item requires more time than five minutes or you are about to step out for a meeting, put the item on your to-do list and schedule it for a day in the future. Juggle existing tasks for that day by priority as necessary, pushing lower priority items to a later date.
Immediately delete the email that spawned the to-do, or file it (if you need to refer to it later) and reference the subject line in your calendar task so you can find the email when you are ready to perform the task. Some programs even allow you to link to-dos to emails – even better.
4) Read and discard (or file) informational emails
If the email is informational and requires no action, read it and delete it. If you want to save it for later, file it and forget about it.
5) Immediately delegate tasks. Track completion as necessary
Knowing when to delegate can be tough, but I generally delegate anything that is clearly the responsibility of a peer or one of my reports. The act of delegation is simply forwarding or replying to the message with the task that needs to be delegated.
If you delegate a hot topic that requires action, put a to-do on your calendar to follow up with the person you delegated it to at an appropriate juncture. Delete the original email.
6) Live in your calendar
Since you have created a bunch of tasks/appointments in your calendar from your email, you have already benefited in several ways: your staff/peers/customers got fast responses, your inbox is clear (so you don’t feel stressed out any more), and you have a clear feeling of how much you personally have to do. Also, since calendars assign dates to tasks, you can evenly distribute the load and priority order of the things that need doing.
I find that each day, I filter my calendar view to just the things that (a) need doing that day and (b) were supposed to be done yesterday (or on days prior) but were not. This allows me to stay focused, and quickly decide if I have too much on my plate for the day. If I have too much, I defer low priority tasks to the next day and forget about them until I see them again.
So you might say: “haven’t you just shifted too much email to too many tasks?” Not really. If I see a task continuing to float around, I don’t get upset about it. If it hasn’t been done yet it probably isn’t that important, or I would have done it already, and at least I know it is on my radar. However, I will often reconsider a long-existing task’s importance altogether and either delegate it or just plain get rid of it if it has floated around too long. You will be surprised how many items you think you need to do that either become unimportant with time, or can live with being pushed back because they just aren’t that important to you or anybody else.
There are a few specific tools/techniques I use that really help me stay focused on this approach to my email.
I use OS X Mail, which has Smart Folders. Smart Folders are rules-based folders, which filter your email for you by criteria. I have four Smart Folders set up: “Unread (Work)”, “Unread (Personal)”, “Unread (All)”, and “Flagged”.
“Unread (Work)” is the top priority when I am in the office. Anything that shows up in here is obliterated into “do”, “delegate”, or “delete” as soon as I see it. Once I clear my work emails, I will clear out my personal emails if any have come in and I have time; if not, I make a conscious effort to look at the personal folder at least once a day, since I don’t get as much email there. “Unread (All)” is a great way to obliterate all my emails (work and personal) if I don’t have very many to go through.
Note that my personal Smart Folder consolidates the emails from all my personal email accounts, so to me they all look the same. Having all my work and personal emails come into the same mail client really makes staying focused easy; I tried separating them in the past, and found I just ended up getting behind on my personal email. One client to rule them all is the best approach.
“Flagged” picks up all the emails that I flag in Mail (there is only one flag, and an email is either flagged or unflagged). I use this designation for emails that look interesting but don’t require any attention or action, such as links from co-workers about technology articles that are unrelated to immediate concerns. I usually go through this folder on weekend mornings as I catch up on my RSS feeds. However, since these emails don’t show up as “Unread” they don’t stress me out – and if I never read them, nobody cares.
We get an awful lot of automated emails at Highwinds. The biggest offenders here are our ticketing and bug tracking systems. I have set up Mail and our Exchange mail server to straight delete the roughly 30% of these emails that I have identified as having zero value to me and requiring no attention or action.
Of the remaining 70%, many of the items are conversation threads between clients and our support team. Mail allows me to group email by thread, but this is a view that generally bothers me. Luckily, Mail remembers my preference to group emails by thread on a per-folder basis. So, I leave all my folders ungrouped except for the one folder that gets all of our support tickets and bug notifications. Then, since these threads are grouped, I can easily delete all the emails for a thread that I have no interest/involvement in, and easily read the entire timeline for the issues that I want to be involved in. Sometimes I will delete an entire thread of 15 emails in one go based upon a subject line, which allows me to make short work of the 200 or so emails I get each day from these systems.
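Mail handles the grouping for me, but the idea is easy to sketch: normalize each subject line into a thread key (stripping "Re:"/"Fwd:" prefixes), bucket messages by key, and then act on a whole bucket at once. A minimal illustration with invented ticket subjects:

```python
import re
from collections import defaultdict

def thread_key(subject):
    """Normalize a subject line so replies land in the same thread bucket."""
    return re.sub(r"^(re|fwd?):\s*", "", subject.strip(), flags=re.IGNORECASE).lower()

# Hypothetical subjects from a ticketing system.
inbox = [
    "Ticket #8812: checkout error",
    "Re: Ticket #8812: checkout error",
    "RE: Ticket #8812: checkout error",
    "Ticket #8844: CDN purge request",
]

threads = defaultdict(list)
for subject in inbox:
    threads[thread_key(subject)].append(subject)

# One glance at a subject line decides the fate of the whole thread:
doomed = threads["ticket #8812: checkout error"]  # all three messages
remaining = [s for s in inbox if thread_key(s) != "ticket #8812: checkout error"]
```

This is why deleting "an entire thread of 15 emails in one go" is cheap: the decision is made once per thread, not once per message.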
I use iCal for calendaring. It doesn’t sync with Exchange (at least, not until Snow Leopard in 2009), but it has great to-do tracking and a very clean UI. You can also easily make a to-do from an email, although this is a new feature for Leopard and I always forget to do this since I was used to making to-dos the old way.
I maintain two calendars: work and personal. This seems to be plenty to organize my life. I have tried using multiple calendars to differentiate projects before, and found that to be a dismal failure – simple is best.
One of the reasons I can calmly dismiss the majority of my email is that Mail has such bad-ass search capabilities. I can usually search my entire inbox and all my folders in under five seconds, which gives me complete confidence in either trashing stuff or whimsically filing potentially important emails.
I also tend to leave emails in my inbox (in “read” status) if I know that I will need them later. Since I only look at my Smart Folders, and those folders only show unread items, I can’t see these emails; as far as I am concerned they are dealt with and they don’t stress me out. I have to leave them in my inbox because my trash is set up to automatically delete anything that has been trashed for more than seven days. The point is that out of sight is out of mind, and since I can find anything super-fast, I can easily look up an old email in my inbox related to a task without having to link the task to the referenced email.
I know this has been a long blog post, so congratulations if you made it through it. I hope you will give my system a try, and/or come up with a variant that works for you. You have no idea how much better you will feel with an empty inbox!