making holes and adding color

back in August I walked into the studio and noticed that one of my mics was missing. it wasn’t stolen thankfully but it was sprawled out on the floor. and then I discovered why. SNAP! eventually the same thing happened to the other three clamps. I’m pretty sure that this clamp was never designed to be used as a permanent installation. why? they all broke in nearly the same place.

Mac Mini circa 2010

it’s a total work of art. really it is. but there’s lots more than the stark nothing look of it to like. or is there?

pretty much everyone is going to sing “HDMI port! about damn time!” but the lack of an HDMI port is only a problem if you are using a Mac Mini as the center of a home theater. and lots of LARGE screens can be driven using a converted DVI cable. I know it works because I set one up to do that. given the tasks that I would employ it to do this port isn’t a boon. but maybe someday I’ll want to watch content that is visually protected and I’ll actually need this feature. but I seriously doubt that will happen anytime soon.

I was curious about how small the motherboard has to be since the new case is so short. the old mini has lots of risers and formers allowing parts to get stacked. so I printed the bottom case image as seen on Apple.com and placed real objects on it. it looks like there is lots of room until you consider there is also a power supply inside that case. the PS likely takes up the space to the right of the hard drive from front to back. if the optical media wasn’t in the case the height could be lowered by 15mm.

going by the dimensions of the Mini (which are 7.7″ by 7.7″ and match the AppleTV size) the motherboard would be approximately 7.5″ x 4″. this really packs it in there. although the mobo for the MacBook Air is even smaller. the older Mac Mini is actually 1.2″ smaller. and it had 1 more USB port.

the video card will always be an issue especially with the Windows crowd looking in. when you look at the Mini (or any of the MacBook line) you see there just isn’t room for anything else. one could argue that there would be room if you lost the Super Drive. but if that happened the case would get thinner, not full of another feature. the other complaint that people will make is that 2.5″ drives are not as speedy as 3.5″ drives. but in what context is the complaint? because if you are looking at pure benchmark data those numbers are nowhere close to how you interact with your computer. it’s really tough to notice the difference between your favorite app loading on various Macs. in fact I challenge you to feel the difference. you can’t, so it doesn’t matter how fast an app loads.

Apple has finally learned that RAM upgrades shouldn’t require a trip to the hardware store for putty knives to crack open the case. after removing the round rubber cover the RAM is right there. interestingly it looks like the same dual SO-DIMM connector that plagued the aluminum PowerBooks. hopefully this fatal flaw of the G4 won’t be an issue with this version of the Mini.

it’s really easy to be critical of the price but even more so when you start poking around the BTO options. this little computer can become very expensive very quickly. but there are places where this Mac is the perfect thing. where, what, why? so my aging MacBook Pro)totype is long out of AppleCare. I’ve thought about the next Mac that will replace it. it’s possible that the Mini could fill the bill. especially considering that one could buy 3 of them for the price of one MacBook Pro.

          MacBook Pro)totype (2006)   Mac Mini 2010
Speed     2.16 GHz                    2.4 GHz
RAM       2G (or 3G if mix max’d)     2G (8G max)
HD        640 G (self upgrade)        512 G
Video     ATI X1600                   GeForce 320
Firewire  400                         800
USB       2                           4
Price     $2600                       $700

having two more USB ports would be awesome. it would mean not having to unmount to swap. and getting back FireWire 800 would certainly seem like an upgrade even though I’ve lived without it for so long. I guess I never really noticed. odd huh? the GPU isn’t something that is going to affect my day to day. I do so little with Motion and I’m not doing any 3D work at all.

90% of my job is typing centric. and I’m pretty much very happy with my current setup. which is the MBP connected to a large display with an aluminum keyboard in front of it. which makes it exactly like running the Mini. there are three tasks that I do where I would like much, much more power: working with video in Final Cut, exporting audio from Sound Track and compressing final content. these are the places where an Octo would rock my work. but I have to question if spending 4X more is worth it in the long run.

and maybe it is, for one single reason: perception. clients don’t need to know that the jobs they are asking me to do can be done with the smallest Mac ever made. they might question the rate or my talent because of the tiny Mac. it’s always a good idea to prop up the idea that the things we do are hard and need powerful equipment. it makes everyone feel better. after all, they are the ones paying for it so bring on the Octo!

oh yeah, one more thing. it’s probably a good idea to check out similarly shaped systems just for comparison purposes. like the Dell Zino HD, which makes everything above seem like a pretty good deal.


WWDC reaction: Back to Work

I didn’t watch the TXT feeds as it came down. it’s largely a waste of time because it’s so eff’n slow. better I think to do real work and then see it all in 30 seconds at the end. so that’s what I did. good thing too because there’s not much here. iPhone 4. that’s nice.

maybe now we can get back to Macs and some real work. where is FCS next? where are the bumped Mac Pros? where is the updated Time Capsule? or how about the missing Mac? where are the iWork and iLife updates? how come half of the stuff that Apple sells is now 6 to 18 months behind the times? when the most innovative update for the Mac in the last 4 years has been the aluminum keyboard there’s a problem. looking back not much has changed since March.

I’m comparing Apple to the rest of the computing world which has been releasing some pretty amazing things that don’t cost an arm to purchase. there’s that HP laptop called the Envy for example. the Lenovo X series. even Dell has some smashing products. but what I’m in awe over is the desktop class computers: how cheap they are and how they out-spec a Mac so handily. of course they’re running Windows but so what. it’s the apps that you are running that make the difference in the end and for some of us it’s the same exact applications like Photoshop, Illustrator, After Effects and a sound editor.

I’m not saying that I’m switching today but my eyes are noticing that the grass really is greener over there.

about the petrified Clementine

at the end of the Maker Faire 2010 conversation is the first “after show” since we restarted. John talks about making a “brain” by drying out a bad Clementine. the only thing you have to do is rotate the “cutie” as it’s drying. it takes about 3-4 weeks for the fruit to dry out. the best part about this “project” would be making it into something else afterward.

the “car” toons John was referring to are the Ugly Stickers, Rat Fink or Odd Rods trading cards and stickers.

More about the Clementines.

it’s good to be back… thanks everyone

hi.

when we ended the show MacBreak Tech I would say, “we’ll see you on the other side.” I’m not sure exactly what I meant by that. Portal wasn’t invented yet so it wasn’t that. maybe it was a someday-we-will-see-you-“in person” thing. or maybe I was thinking about switching to Windows. in hindsight I think what I meant was “some day we’ll make something even better and we hope you join us there.” so here we are just over a month old after a great big reset.

I want to take a few paragraphs to thank everyone from the listeners to the cast for all the help and support you’ve given to the show/podcast/blog/platform that is Know Tech. it’s a better day because I get to work on the content here. I’m having a blast making, researching, editing and planning what’s next. there’s so much to learn. the interesting thing is that all of that will end up on the feed or as an article. even the bug fixes can become content. ha! our mistakes are your benefit.

it’s the little things that come back from you (aka the listeners) that make it totally worthwhile to do. like hearing that you support my wonky vision of what a “show” should be. that you like the “un-format” of something that has no episode numbers, introductions or show-stopping ad drop in the middle right when it was getting good. I don’t like these things myself so why would I put them in a show that we create? I’m happy that you’ve written in asking questions that sometimes even became a conversation. seeing feedback no matter what it is on the twitter or in email helps make the show your show.

I’ve been inspired enough by all of you to work to continue to improve the quality of what we produce. I ordered a bag of chips (no not those chips) so I can finish a microphone project. which will become another article (see how this works). I’ve looked into doing other build projects like the G4 NAS Drive that reuses old computers in a new way.

finally, I’m happy to say that there will be new voices in the coming weeks. I’m working with Jordan who happens to eat and breathe games. Chris White got his mic sorted so we’ll hear his take on tools that will help your creative endeavors. Tom Mahady has his Makerbot doing his bidding, making whatever he thinks up next. and I’m looking forward to having Ben Durbin and Kanen Flowers behind the microphone as well. both of them are great explainers. and don’t forget the special guests!

thanks again for listening. keep those comments and twitters coming. we’ll keep making because I promise you there’s lots more where that came from on this side of things.

John Foster
@knowtech


has anybody jumped platforms lately?

lately I have this “grass is greener” feeling. that I’m missing something by not dividing my time between the platforms more. and I’m getting this even though I spend time in Linux (server not desktop), Windows XP, and 7 throughout my week.

I’m mostly happy with my workflow but from time to time I look at other tools to do work. I feel that it’s important to stay current. and if current means leaving one platform for another I’m all for it. for example Final Cut and Sound Track are tools that I use in my day to day. but FCP really bugs me lately because it feels like an app from a decade ago. and Sound Track seems nice until you have to wait for waveforms to draw again for the 3rd time. that means I’m looking at Adobe, Avid or Sony as a place to do media work. it also means potentially disgustingly cheap hardware. although I will spend the money to re-fan the thing so it’s quiet. I’m not in a hurry to make a jump because NAB is around the corner. this almost always brings new versions of everything to the table.

unlike router software, which has no impact on how you use it, OS or tool switching has the devil-you-know factor to contend with. it’s pretty hard to undo years of learned habits and twitches. thinking at the speed of thought is pretty important. it’s no fun to have to stop a thought to look up how to make it happen. but some things are the same no matter where you do that work. editing, writing, photo retouching and uploading are all those things. although the Control placement on the keyboard makes some things feel awkward (no problem you say, just make CAPS lock into CTRL – done!) and it’s this kind of thing that makes it even possible for me to consider something else.

I’m not going to ditch to another place entirely but I’m wondering if anyone has and why.

another benchmark tool… sigh. just what we needed.

actually, there can never be enough tools like these. each of them generates some numbers. sometimes there is a graph to go with the number. sometimes the numbers the tool generates will be different on each machine you run it on. and what is weird about that is you can have two machines that are exactly the same yet both will generate different results. so right out of the gate there is a hint of skepticism generated. what numbers do you believe? certainly NOT the numbers in the brochure created by marketing. you need tools to generate your own tests or those run by disinterested 3rd parties.

benchmark tools weren’t created to lie to us. but sometimes they are used that way unintentionally. we all know that faster is better, that bigger is better and that smaller and slower are not measures of success. marketing wants to present the best numbers possible. thing is, I’m really not interested in the numbers that marketing has to offer. because what they promise isn’t the same as what I see in my day to day use. what we should avoid is generating “Frame Rate Bragging Rights” and focus on the question “can this tool do the job I need it to do?”

the thing with testing is that it’s subjective. all of us carry some baggage into a test because we already know a few numbers going in: we know the theoretical top speed of a SATA2 drive, the number of triangles the GPU can fling or the clock speed of the CPU. a Windows user may bring opinions about using a Macintosh the same way that a Mac user might be shocked (and unbelieving) that the cheaper PC is actually faster and better. the things known however don’t translate into what is really happening.

there is another kind of subjective which is influenced by time of day and where you are. I know from testing colors on displays and printers that what I see at 12PM is very different from what I see at 12AM. our eyes lie to us depending on the kind of light there is. what looks white in one environment will turn pink in another. it’s all about our brain knowing what white is supposed to look like so it corrects what you are seeing to make it so.

having tested and reviewed video cards I know that it’s tough to come up with meaningful commentary about the new thing. if you take the old video card out and replace it with the new kick-ass thing the first thing you notice is NOTHING changed! windows still draw, menus still pull down, it’s as if you wasted $300. the robot scroll test [scroll from the top of a long document to the bottom] gives me an idea of screen refresh but the iWork apps don’t present data that way anymore. it’s more likely you are working from marked up paper going from page to page.

sometimes the experts spout off before they’ve done any actual testing. the recent example is the new integrated nVidia graphics processor found in some of the new Macs. just because it’s integrated doesn’t automatically mean its graphics are going to perform badly. the “marketing” on Apple.com/iMac shows there is an improvement over the last iMac that had a “real” graphics processor. granted that GPU was kind of anemic to begin with. so it’s not really comparing Apples to Apples. even on the PC DIY side the nVidia motherboards with integrated graphics kicked butt compared to Intel’s offering. but everyone “knows” that “integrated sucks” so even though there was an advantage it was automatically dismissed by the “press”, “reviewers” and “geeks” because a graphics card is always better. never mind that it isn’t the case.

to make testing not subjective means that you have to have lots of different ways to test things. copying a file using the Finder and timing it with a stopwatch will get you variation because of the human factor. copying files with a script will get you consistency in the test but who copies files using a script in the real world? using a binary that copies theoretical data is NOT the same as moving a folder that has text, video, photoshop files, fonts and whatever else is there. if there is an advantage that a new GPU core has that can only be gained with a specific patch it might not exist… yet. there may be unrealized performance. this has been the case since forever. in fact sometimes there is no improvement because of the way the software was programmed. Castle Wolfenstein was frame locked to 60FPS. so even if you had the better X800 card it wasn’t going to render any faster compared to the 9800. and that pissed off lots of people looking for FPS bragging rights.
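for what it’s worth, here’s a minimal sketch of what scripted copy timing looks like (python, building its own little folder of mixed files so it runs anywhere). the script gives you consistent, repeatable numbers, but like I said, nobody copies files this way in the real world:

```python
import shutil
import tempfile
import time
from pathlib import Path

def timed_copy(src: Path, dst: Path) -> float:
    """copy a folder tree and return the elapsed seconds."""
    start = time.perf_counter()
    shutil.copytree(src, dst)
    return time.perf_counter() - start

# build a small mixed-content folder so the test is self-contained
work = Path(tempfile.mkdtemp())
src = work / "src"
src.mkdir()
(src / "notes.txt").write_text("text file\n" * 1000)
(src / "blob.bin").write_bytes(bytes(range(256)) * 4096)  # ~1 MB binary

# run the copy a few times; no stopwatch, no human factor
times = [timed_copy(src, work / f"copy{i}") for i in range(3)]
print([f"{t:.4f}s" for t in times])
```

the variation between runs here comes from the machine itself, not from thumb timing, which is exactly the point and also exactly the problem.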

one problem with using applications for testing is that the results are revealing something else entirely. I’m surprised that Apple is using Motion as a benchmark. mostly because Motion is far from taxing the GPU. most of Motion’s particle effects are made with just a few triangles. meaning you’ll get the same results on every Mac you try the effect on. Motion was designed this way intentionally. and if you “push Motion to the limit” you are far more likely running the card out of memory instead of running out of triangles or pipelines. which means you are testing pushing stuff across the PCI bus and not taxing the GPU.

be wary of the X claims. I really have to question the 1.X, 2.X, 3.X faster claims and a graph showing the difference. let’s say it did 2FPS before and now it does 4FPS. that’s 2X faster right? see how this is misleading? both numbers are a slideshow, so the 2X gains you nothing you can actually use. and why is frame rate the measure of speed? if you are measuring hard drive speed you should be able to stop at the SATA drive’s top transfer rate. thus a SATA2 drive is 2X faster compared to a SATA drive. yet so many other factors come into play here from the controller, what’s being written and what else is going on.
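to put numbers on that, here’s a tiny sketch (the FPS figures are just illustrative) showing why the multiplier alone misleads — the relative number can look great while the absolute number stays useless:

```python
def speedup(old_fps: float, new_fps: float) -> tuple[float, float]:
    """return (relative speedup, absolute FPS gained)."""
    return new_fps / old_fps, new_fps - old_fps

# "2X faster!" sounds great until you look at the absolute numbers
print(speedup(2, 4))    # 2X faster, but 4FPS is still a slideshow
print(speedup(30, 33))  # only 1.1X, yet this is the one you might actually feel
```

same lesson as the graph on the brochure: always ask what the bars are measuring from.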

the better benchmark is finding a frame rate like 30FPS (or whatever FPS you choose, since we’re looking for a good-better-best result) that runs on all the Macs. now turn on Anti-Aliased Edges, Fog, Shadows and other effects until 30FPS can no longer be achieved. then SHOW the screen shots. the “faster” GPU should be able to provide a better looking (visual) play experience.

in the case of video I don’t care how fast it goes as long as it can sustain the data rate promised. DV video is data locked to 3.5 megabytes a second. it would have to be a pretty pathetic RAID not to be able to keep up with that. having a faster bigger drive won’t make that video play any faster (that would be weird). so video that is 720P or 1080P has data rates that are as much as 57X more demanding! do I care about the numbers or is answering the question “can you sustain that or not” good enough?
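the arithmetic behind those multipliers is easy to sketch. assuming uncompressed 8-bit 4:2:2 video (2 bytes per pixel — the exact multiple depends on frame rate, bit depth and chroma sampling, so treat these as ballpark figures):

```python
DV_RATE = 3.5  # MB/s, the locked DV data rate mentioned above

def uncompressed_rate_mb(width: int, height: int, fps: int,
                         bytes_per_pixel: int = 2) -> float:
    """rough sustained data rate in MB/s for uncompressed 4:2:2 video."""
    return width * height * bytes_per_pixel * fps / 1_000_000

for name, (w, h, fps) in {
    "720p60":  (1280, 720, 60),
    "1080p30": (1920, 1080, 30),
    "1080p60": (1920, 1080, 60),
}.items():
    rate = uncompressed_rate_mb(w, h, fps)
    print(f"{name}: {rate:6.1f} MB/s ({rate / DV_RATE:.0f}X DV)")
```

whatever the multiple works out to, the question stays the same: can the drive sustain that rate, yes or no.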

another problem with benching with programs like Modo, Cinema4D or Motion Builder is that the GPU improvements are measured in 1/10th of a second or less. the user may report (or not) that they feel an improvement with a faster card. most of the time this is false. I did a test where I told people “try this, now try that… okay this one has this card, that one has that card. that card is faster. now try this, try that. which one is faster?” everyone picked the “faster” box. the thing was both had the same video card in them. part two of the test did the same thing only this time a faster card was present. half were told the slow card was the fast card and the other half were told the truth. everyone once again picked the machine “with the fast card” as the faster Mac. amazing test. surprise right? everyone always buys into marketing.

finally, a tool like Cinebench only shows the potential of the card when it’s slammed to the wall. but nothing does that in day to day use. games sometimes take advantage of the GPU to the max but most games are written to work on a wide range of Macs, not a specific top end model. we’ve seen some paint programs that take advantage of the GPU to do effects. but it’s the user, not the GPU, that’s the slow component here.

benchmarks are tough things to interpret. sure, the “to the metal” tests are interesting but it’s only one perspective. it’s comparable to saying “this is how fast it can go downhill with a tail wind driven by our best driver.” I want to see tests that use the same function calls for opening, closing, writing and reading files that the app uses. there are rules that apps have to follow allowing them to share a code base so they work on G4, G5 and Intel processors. without the abstraction the programmer would be buried under keeping it all straight. thing is, those functions aren’t as fast as writing to the metal. and I’m fine if it’s a robot that’s doing the testing. because it’s just a number in the end that will tell me something about the system in front of me. in the end I really want to see benchmarks that reflect the way applications work. not just some numbers with sideline text that says “longer is better” or “shorter is faster”.

the big wrap up of this is that I can make anything slow. you know, load up a few more layers in After Effects, turn on fog, add another ray in the tracer or make 24 simultaneous copies (start one, start another, repeat). that always works. the flip side of benchmarks is this reality: “Ohhhhhh, 8 core 3GHz processor with 16G of RAM! we’re typing fast now!”

what’s missing from this picture?

I was asked the other day by several different people questions that started out “when do you think the new….”

MacBook Pro with new mobile i5
Mac Pro with those new Intel CPUs
Mac mini
iPod Touch with camera
30″ display or even a 27″ display
AppleTV
Airport / Time Capsule / Express
Final Cut Pro
iWork
iLife

see the trend here? lots of people around me are noticing that the whole entire product line is growing more in need of a refresh as each day passes. it seems really odd to me that all this stuff is on hold. the biggest one is the Mac Pro. but without the numbers it’s really impossible to see if it’s really worthwhile for them to care. the displays not being upgraded is easy to understand. but every Mac can talk to one so why wouldn’t they get respun?

software is like printing money. every release brings a windfall. so what’s the hold up?

it’s not because “all hands” are building the big thing. Apple is very divisionalized in a way that keeps everything profitable. the few exceptions are in iTunes and Mac OS. because both touch every single product in the line.

in the past we’ve seen things just show up on a Tuesday without any prior fanfare, warning or rumor mill mongering. so maybe that’s what will happen sooner. I’m sure that all of these things and more are in the pipeline. and at this point we’re 3 months past the radio silence that the holidays bring. so where is this stuff?

re-restart

it’s kinda like getting a new pair of shoes. they feel funny because they aren’t broken in. they smell funny because they are new. but they look cool. to get an idea of how it looked I imported the content from the old site. I’m going to delete most of it because it’s old. but this is what it looks like. I still have to tweak stuff, get the logo in there, fix the excerpt text, get the podcast relisted, fix lost pictures and a slew of other stuff like changing the blue links to not-blue links. and over time I’ll add more features.

I also have to re-style the forum to match this minimalist look. but overall I’m very happy with how this looks!

my other thoughts? well, hmmm. this is a really different direction than what I was making originally. it’s flattened out. it’s article centric rather than being product centric. this really opens it up to whatever. the only thing I really want is to be able to pick post pages so that things match up better in terms of advertising. and while that’s there now I’m even thinking of having that disappear. what?! no verts!?! well, not in the same sense. my thoughts are that we should concentrate on the content as being the advertising. for example if the subject is web hosting the links can be to whoever is an affiliate. a good example of how that would work is the Mackie 1202 recording (posting soon). at the end of the discussion you go “oh, I want that” and click buy! remember that we are talking about things that we use. so it’s not like we are plugging just to plug like other shows.

accounts are next. I just found a plug-in to do the writer, editor, publisher workflow. which will save all kinds of headaches.

it’s restarted. phew.