Vienna Symphonic Library Forum

  • 64 bit comments

    Hi,

    I came across the comments below concerning 64-bit operating systems, written by a Photoshop developer, and thought they might be of interest.

    Julian

    "Photoshop co-designer Scott Byer said Thursday that his team fully intends to launch a 64-bit version of its popular image editor, but that doing so for the upcoming version included with Creative Suite 3.0 (CS3) would be impractical.



    Responding to questions from users about a lack of 64-bit support in the latest Photoshop CS3 beta, Byer in his official blog pointed out that many of the perceived benefits of 64-bit computing simply won't manifest themselves with current generations of hardware and software.

    Byer said most Photoshop users are still running operating systems that only support 32-bit memory addressing for each program -- including Mac OS X Tiger, which can only assign 3GB per application. This, he says, eliminates the primary advantage of 64-bit technology: memory addressing beyond the 4GB barrier inherent to 32-bit software.

    "Let's check all the 64-bit hype at the door," he wrote. "[64-bit apps] can address a much larger amount of memory. That's pretty much it. 64-bit applications don't magically get faster access to memory, or any of the other key things that would help most applications perform better."

    In fact, Byer added that most of today's computers would actually incur a performance penalty as the code -- which is literally twice the size when accomplishing the same task -- would bog down the memory subsystem, reducing the amount of information that could pass through at any given time. Contemporary AMD and Intel processors only occasionally stand to gain from 64-bit code and often see their advantage negated by file caching.

    The Adobe developer particularly rules out Mac development of a 64-bit edition of Photoshop CS3, blaming Tiger's fundamental 32-bit restrictions despite its selective 64-bit elements. "Many of the libraries an application would need to be fully 64-bit aren't available. So, on the Macintosh side," he wrote, "the answer [to the likelihood of a 64-bit version of Photoshop CS3] is zero."

    While Byer says that he would love to update his company's star program and take advantage of more than 4GB of memory, he admits that the time spent on 64-bit technology would be better used for polishing the Universal Binary for Mac users and adding features that would be more immediately appreciated by artists looking to upgrade from earlier versions. However, he promises that a 64-bit edition is all but inevitable when more computers start using the greater memory space.

    "It's a when, not an if," he wrote."

    From appleinsider.com
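
    For anyone curious where that 4GB barrier comes from, here's a tiny C sketch of my own (not from the article) -- the width of a pointer fixes the size of a program's address space:

        /* Illustration only: 32-bit pointers can address at most
           2^32 bytes = 4GB per process (less in practice, e.g. the
           ~3GB per application mentioned for Tiger above). */
        #include <stdio.h>
        #include <stdint.h>

        int main(void)
        {
            /* On a 32-bit build this prints 4; on a 64-bit build, 8. */
            printf("pointer size: %zu bytes\n", sizeof(void *));

            /* The 32-bit limit, computed in 64-bit math so it
               doesn't overflow: */
            uint64_t limit32 = (uint64_t)1 << 32;
            printf("32-bit address space: %llu bytes (4GB)\n",
                   (unsigned long long)limit32);
            return 0;
        }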

  • it's good to see serious developers also talking publicly about the cons and restrictions, and not blindly jumping onto the hype ...
    christian

    and remember: only a CRAY can run an endless loop in just three seconds.
  • This is why it's important to study the developer responses as various builds of Leopard are seeded.

    Still, 64-bit apps (imho) are at least two years away from being a reality on the Mac platform. There seems to be quite a serious limit here-- CPUs are hitting a 3GHz brick wall, but are multiplying like rabbits-- if the CPU won't increase substantially in speed, then Apple opts to put more of them in one machine. Very odd in a way.

    But other things will have to improve to get the best out of VIs-- hard drive speeds and transfer rates as a standard would be my first desire for improvements. Bus bandwidth would also be among the most important improvements.

    While I never expect one computer to run much more than a handful of the VI instances needed for a full orchestral realization, it remains a little strange that more can't be done on one computer than is currently possible.

  • Actually the use of multiple processor cores is not "odd" at all, but a result of the physics of processor design-- the limits on processor speed vis-a-vis energy consumption and heat necessitated finding another approach, which is the use of multiple processors. Parallel processing implemented with computer clusters has been in use for quite a while as a means of creating relatively inexpensive supercomputers-- putting multiple processors in one machine creates a mini "cluster." Of course, software needs to be designed to make use of multiple cores; otherwise the OS will assign each program to one core only.
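
    To illustrate that last point with a toy C sketch of my own (not how any of the applications discussed here actually work): a single-threaded program sits on one core no matter how many are installed, while work split across threads can be scheduled onto all of them.

        /* Hypothetical example: spread stand-in number crunching
           across several threads so the OS can assign them to
           different cores. Compile with -pthread. */
        #include <pthread.h>
        #include <stdio.h>

        #define NUM_THREADS 4   /* e.g. one per core on a quad-core */

        static void *crunch(void *arg)
        {
            long id = (long)arg;
            double sum = 0.0;
            for (long i = 1; i < 50000000; i++)
                sum += 1.0 / (double)(i + id);
            printf("thread %ld done (sum %f)\n", id, sum);
            return NULL;
        }

        int main(void)
        {
            pthread_t threads[NUM_THREADS];
            for (long i = 0; i < NUM_THREADS; i++)
                pthread_create(&threads[i], NULL, crunch, (void *)i);
            for (long i = 0; i < NUM_THREADS; i++)
                pthread_join(threads[i], NULL);
            return 0;
        }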

    On the question of 64-bit software: Apple claims that Leopard will be fully 64-bit (but able to run 32-bit applications). In light of this, it seems logical that Apple would like to release 64-bit versions of its Pro applications (e.g. Logic Pro, Final Cut, Aperture) along with or soon after the release of Leopard. These are memory-intensive applications, so making them 64-bit would (assuming the updates were well executed) represent a major increase in functionality. Releasing 64-bit versions of these applications would also give Apple the chance to price the updates expensively-- in the manner of the $299 update from Logic 6 to Logic 7.

    I'm not sure what the point of an 8-core machine equipped with 16 or 32GB of RAM would be if it could only run a "handful of instances of VI's." If that were the case, one might be better off with several Mac Minis or inexpensive Windows XP machines.

    A few years ago, I could not imagine being able to play 64 or more channels of EXS24 and Gigastudio virtual instruments (together with five instances of Altiverb and several of Logic's own plug-ins), using a music notation program (Finale) as the MIDI source (on the same G5 with Logic 7), while simultaneously bouncing the "performance" to an audio file in Logic. Yet starting with the advent of dual-processor G5's, OS 10.3 and Logic 7, that's just what I've been able to do. This setup has (at least on the Mac side of things) NEVER crashed, while, with my old G3 running OS 9 and performing infinitely simpler tasks, I felt lucky to get through the day without at least one crash. In other words, things do change!!

    As far as Adobe is concerned, they are quite late with an Intel-native version of Photoshop. It is also very much in Adobe's interest to downplay the significance of 64-bit software when they are far from ready to release a 64-bit version of Photoshop.

  • Steve, I don't disagree with you at all. The technical reasoning is very clear. What's not clear is when the heat/energy hurdle will be overcome-- or why it's taking a while to get there. In the meantime, the mini cluster solution is the only way to go for the moment.

    Likewise, I also agree with the software quandary: hardware can do *more*, in layman's terms, than the software can. Photoshop has been my personal benchmark app, particularly on the Mac platform-- so much has fallen in line behind PS developments that one can almost set one's watch by software trends as PS takes its stand.

    You also hit on something about which I've mused elsewhere-- namely that if any company gets apps to 64-bit, it will be Apple. If it does no good to have a powerful computer when software won't take advantage of it, it follows that a company like Apple would follow its hardware pioneering with fully optimized versions of its own software.

    Any allusion to the oddness of all this has less to do with it being shrouded in mystery and more to do with how awkward a situation it is, with the hardware currently a bit ahead, so to speak, of the software.

    That will change soon enough, and perhaps Steve Jobs will have much more to say about this next week, since his last keynote made a headline of 64-bit processing with no real mention of Apple's pro apps taking advantage of it just yet.

    64-bit aside, just getting 32-bit apps completely recoded for Universal Binary and optimized for the increasing sizes of CPU clusters will be a most interesting development to watch.

  • @Another User said:

    I'm not sure what the point of an 8-core machine equipped with 16 or 32GB of RAM would be if it could only run a "handful of instances of VI's."


    And if the 16GB of RAM continues to cost $4000.

    But I have to repeat that running both plug-ins and stand-alone instances of instruments is the shiznit. G5s may be slightly underpowered compared to the latest Intel machines, but I can't believe how much I'm able to get out of a single 2 x 2.5 with 8GB of RAM in it.

  • @Another User said:


    G5s may be slightly underpowered compared to the latest Intel machines, but I can't believe how much I'm able to get out of a single 2 x 2.5 with 8GB of RAM in it.


    Nick-- can you post your instances and sample count? Or, if you've already done this, can you point me to a thread? I have the same CPU and RAM as you, but I'm a little underwhelmed at what I'm able to get loaded in at once.

  • Nick:

    If history is any guide, prices for ECC FB RAM, now extraordinarily high, are likely to come down. I'm old enough to remember when 16MB chips were priced at $1,000+ per chip.

  • "...can you post your instances and sample count?"

    I don't know if Nick ever posted these specifically. Search for the following thread for the most detailed overview I can recall: "Vienna Instruments Plugin + Standalone and Logic??" I tried to copy and paste the link, but it wouldn't take (operator error, probably).

    Nick, I have a question for you too. With your RAM maxed out, what kind of CPU hit do you take on a 2 x 2.5 when the sequencer is running with nothing playing back? As I mentioned in another thread, just loading the RAM seemed to require a lot of CPU while running, even when the SPL was passing over no regions. So upping my memory was a mixed blessing: I could have more samples online, but I could play back even less than at a lower RAM config. I'm on a 2 x 1.8, so maybe things get exponentially better at 2 x 2.5.

  • I'll post when I get a chance to load it all up again. What I can say is that it was playable - the CPU wasn't gagging.

  • @Another User said:

    I'm old enough to remember when 16MB chips were priced at $1,000+ per chip.


    Hey you young whippersnapper, I'm old enough to have paid $400 to upgrade my Mac Plus from 1MB to 4MB. And that was after paying $650 for the 30MB hard drive that went with that "insanely great" system. [:P]

  • 1x VSL Stand-alone + Logic

    http://homepage.mac.com/virtualinstruments/.Pictures/1X%20VSL%20stand-alone%20+%20Logic.png

    Logic, 3 Viennas, K2, other stuff

    http://homepage.mac.com/virtualinstruments/.Pictures/Logic,%203%20Viennas,%20K2,%20other%20stuff.png

  • Nick--

    You are da MAN!! Thanks!

    J

  • "...the CPU wasn't gagging."

    By comparison, my CPU meter shows 35% use on the left (first core) and about 20% on the right (second core) after I hit play with nothing performing. And that's with only one GB in VSL server and one-half GB in Logic, and a remaining 3.19 GB reported as "free" in Activity Monitor.

    When I loaded RAM closer to Nick's numbers, I was seeing a 50 to 60% hit in *both* CPU's as soon as the transport was running without any music. It was a surprising loss of CPU headroom just because the RAM was loaded, irrespective of playback limits, bus speed, disk speed, etc. I was paying a CPU penalty just for getting more sounds online. Within a handful of cross-faded layered tracks, or after maybe ten velocity-switched tracks, Core Audio overloads would stop playback.

    I think in 12 to 18 months, this will all be moot.

  • Yes, I'm looking forward to hardware and software improvements myself. An Intel tower or two is on my must-have list for 2007.

    I'm also eager to see how Leopard impacts Cube updates...

  • @JWL said:

    ...if the CPU won't increase in speed substantially, then Apple opts to put more of them in one machine. Very odd in a way.

    But other things will have to improve to get the best out of VIs-- hard drive speeds and transfer rates as a standard would be my first desire for improvements. Bus bandwidth would also be among the most important improvements.

    While I'll never expect that one computer will run much more than a handful of instances of VIs needed for a full orchestral realization, it remains a little strange that more can't be done on one computer than is currently possible.


    The "multiplying" of CPUs seems pretty much unavoidable to me -- unless they come up with a fundamentally different paradigm for processor design. There are physical limits to how far the current approach can go (power consumption vs heat dissipation), which we first saw when the early 90-nanometer chips hit the shelves. It was a bit of a fiasco, really...

    But the RAM thing is still a strange and interesting problem to me. I think you're absolutely right about hard drive and bus speeds being the primary focus for the work we do. Some of the advances in flash-based drives, like the ones Samsung released back in the summer (I haven't looked into this lately... any improvements?), are quite promising. IMO, we're kind of looking for the wrong solution by wanting all our samples to be loaded into RAM. The speed of RAM today is necessary for *calculations*, but is basically overkill for playing back samples. Yes, when you've got loads of samples to play at the same time it's a somewhat different story, but that could also be dealt with by a flash-based HD on an appropriately designed bus (I don't know the numbers, but PCIe is probably already capable).

    And if you keep in mind that all the samplers we've been using for the last few years have been "streaming" samplers anyway, then it's quite clear that it's only the first few milliseconds of playback where there's a major issue. With a device that could deliver in those first milliseconds (i.e., the next-to-zero seek time of a flash-based HD), we should be able to stream to our heart's content. Anyway, I don't think loading more and more into RAM is the only, or the best, way of improving sample-based work.
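
    To put rough numbers on that "first few milliseconds" point, here's a back-of-the-envelope C sketch (the figures are my own assumptions, not anyone's actual engine):

        /* Assumed figures for illustration: 24-bit stereo samples at
           44.1kHz, with a 60ms preload window to cover seek time. */
        #include <stdio.h>

        int main(void)
        {
            const long sample_rate = 44100; /* samples per second */
            const long channels    = 2;     /* stereo */
            const long bytes_per   = 3;     /* 24-bit */
            const long preload_ms  = 60;    /* covers drive seek time */

            /* RAM needed to hold just the attack of one sample */
            long per_voice = sample_rate * channels * bytes_per
                             * preload_ms / 1000;
            printf("preload per sample: %ld bytes (~%.1f KB)\n",
                   per_voice, per_voice / 1024.0);

            /* a hypothetical 2,000-sample template would preload: */
            printf("template preload: ~%.1f MB, with the rest streamed\n",
                   2000 * per_voice / (1024.0 * 1024.0));
            return 0;
        }

    At numbers like that, the preload footprint is tiny next to keeping whole samples resident in RAM.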

    The other solution, which I mention whenever I get the chance, is to merge the sequencer and the sampler at the lowest level possible. I keep pestering everybody about a VSL-designed and created notation-based sequencer because I genuinely think that would be the best solution. The fact is that, except for those times when we're literally *playing in* a new part, we absolutely *do not* need all those samples loaded in RAM! For the most part, the incredible speed that RAM is capable of is wasted on storing samples for music that plays back in precisely the same way every time we hit the "play" button.

    If VSL authored a sequencer, even if it weren't notation-based, they could address the preloading of samples based on the information contained in the sequencer track. In the simplest possible model, they could create a system whereby, once a passage is recorded, a sample list is created, with timestamps for sample preloading. The preload timestamps would be placed an appropriate period before each note, say 50 milliseconds (overkill, probably), and the sample head would be buffered in that window. I built a system like this in MaxMSP and Java which actually worked quite well (though, not being a sequencer itself, it needed to analyze MIDI files to build its sample list), and those are definitely not the best-suited languages for this kind of work! (In a more sophisticated version, the timestamp could be calculated dynamically, in order to distribute the workload more efficiently in busy passages.)

    But of course, if you don't know what note is coming, then you can't buffer the sample to play it, so unless VST is given a view of the entire sequence (does VST 3 support this?), we won't see this *extremely* simple solution implemented, since plugins remain completely blind to the future.
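
    Just to make the preload-list idea concrete, here's a toy C version (invented types and numbers -- this is not the actual MaxMSP/Java system):

        /* From the notes already recorded in a track, build a sorted
           schedule of "start buffering sample X at time T" events,
           T being a fixed lead time before each note sounds. */
        #include <stdio.h>
        #include <stdlib.h>

        typedef struct { double start_ms; int sample_id; } NoteEvent;
        typedef struct { double preload_ms; int sample_id; } PreloadEvent;

        static int by_time(const void *a, const void *b)
        {
            double d = ((const PreloadEvent *)a)->preload_ms
                     - ((const PreloadEvent *)b)->preload_ms;
            return (d > 0) - (d < 0);
        }

        int main(void)
        {
            const double lead_ms = 50.0;  /* the "overkill" lead time */
            NoteEvent track[] = { {1000.0, 7}, {980.0, 3}, {2500.0, 7} };
            size_t n = sizeof track / sizeof track[0];
            PreloadEvent sched[3];

            for (size_t i = 0; i < n; i++) {
                double t = track[i].start_ms - lead_ms;
                sched[i].preload_ms = t < 0 ? 0 : t; /* clamp at song start */
                sched[i].sample_id  = track[i].sample_id;
            }
            qsort(sched, n, sizeof sched[0], by_time);

            for (size_t i = 0; i < n; i++)
                printf("t=%7.1f ms: preload sample %d\n",
                       sched[i].preload_ms, sched[i].sample_id);
            return 0;
        }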

    Anyway, the point is that unless we plan on having 50 keyboard players on stage, all playing a single VI each, *live*, then there's absolutely no need to have all those samples playing from RAM on each and every pass.

    J.

  • ...of course, I realize that the "solution" I proposed above is basically the same as "freezing" tracks, and that the real issue would then be reloading the samples for live playback/recording. But that's where a flash-based drive would come in, making the buffering of samples for live playback *much* faster.

    Obviously, I don't have a final solution. I'm just dreaming and hoping, like everyone else...

    J.

  • i'm considering the hybrid drives (legacy rotating magnetic disks combined with flash) a dead end even before they are released. flash memory has a limited number of write cycles (80,000 - 150,000) and becomes unreliable after a certain number of them. very nifty algorithms have to be used to spread the write operations evenly across all available memory cells.
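
    to illustrate the idea (a toy sketch of my own -- real controllers are far more sophisticated):

        /* naive wear leveling: always write to the least-worn block,
           so erase cycles spread evenly instead of hammering one cell. */
        #include <stdio.h>

        #define NUM_BLOCKS 8

        int main(void)
        {
            int erase_count[NUM_BLOCKS] = {0};

            /* simulate 100 writes that all target the same logical block */
            for (int w = 0; w < 100; w++) {
                int target = 0;
                for (int b = 1; b < NUM_BLOCKS; b++)
                    if (erase_count[b] < erase_count[target])
                        target = b;
                erase_count[target]++; /* erase-before-write wears the block */
            }

            for (int b = 0; b < NUM_BLOCKS; b++)
                printf("block %d: %d erase cycles\n", b, erase_count[b]);
            return 0;
        }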

    more interesting would be millipede storage (originally announced for last summer) http://en.wikipedia.org/wiki/IBM_Millipede offering data throughput in the gigabyte range - an ideal medium for read-only data like sample libraries.

    we will have to wait and see what millipede's latency values will be in real life, because the latency of flash drives is great for small buffer sizes, whereas their storage size is still too limited (currently 8 GB)
    christian

    and remember: only a CRAY can run an endless loop in just three seconds.
  • Yeah, the millipede looks great! I think you pointed me to that article once before... (It sounds familiar, anyway.) I didn't mean to suggest that flash-based drives were necessarily the best way to go, just that something other than rotating magnetic discs would probably be the way forward. But it's nice to have some details! Thanks.

    J.

  • Plowman, if I remember right I had a sequence playing when I did one of those tests. (This was a few months ago now.)

    I wonder why your machine isn't behaving the same way. And to tell you the truth, I'm not as excited about 64-bit access as I once was, because I'm able to access all the affordable RAM I need right now. Maybe Intel Mac RAM prices will come down, but somehow the idea of spending $250/GB (vs. the $75/GB I paid for the 8GB in this machine) to access still more RAM on a single machine is uninspiring.

    A 2x2.5 G5 with 8GB and a SATA card for more storage may be 35 computer years old (1 year = 20 computer years), but it's still a very, very serious DAW. I bought this machine in April 2005; historically, machines have felt very old by the time they reach 1-3/4 years, but not this time. I've been upgrading machines every couple of years since the mid-'80s, so this is quite a change. Part of the difference is that we're not using just one machine anymore, of course. And then Macworld is coming next week, and NAMM the week after, so it's also possible that this will sound silly. However, the point remains.

    In any case, I like running stand-alone instruments outside the DAW on the same machine very much. To me it's not a disadvantage compared to having everything inside the DAW.