Apple should buy TiVo redux

Comments

  • Reply 21 of 72
    Quote:

    Originally posted by kim kap sol

    You say all this as though they are facts...they're not. TV could become an interactive experience like I said earlier...it doesn't need to remain a passive experience.



    This has been predicted for over 20 years now. I've given up. To paraphrase a quote about artificial intelligence..."Interactive TV is the wave of the future...and it always will be."



    Quote:

    And a recent survey shows that a shitload of people watch TV while they're using the computer. So bringing TV *TO* the computer is *NOT* the limited market you make it out to be.



    I am one of those people. I don't want the two combined. I am using my computer for work and my TV for entertainment. I sit with my laptop on the couch (sometimes) and browse, email, etc. while I watch re-runs of M*A*S*H or some other thing that doesn't require my full attention. I don't see the need for having the TV running on my computer or having my email on the same screen as M*A*S*H.



    Quote:

    As computer monitors get bigger, I see less and less reasons why TVs and computers should be separate products. Your lack of foresight, mmmpie, makes it clear that there are people like you out there stunting technology expansion/growth.



    First, the largest computer monitors right now are about 30" and cost about $3000. I just bought a 36" SONY (tube) TV for $1000. Second, just because the technology that is starting to go into TVs and DVRs is the same as is in computers doesn't mean that they must be converged into the same device. Third, what you are predicting and asking for has been predicted and asked for (by technology enthusiasts) for probably 20 years. Consumers don't seem to want it. The TiVo example is a poor one because the extent of the interactivity is selecting shows it has recorded and rating things thumbs up/down, maybe entering some information about viewing preferences. Pretty simple.



    I'm sorry, we just disagree on this. Believe me, I thought for many years that this convergence would be the best thing since sliced bread. I don't any longer. Some devices make sense to converge, others do not.



    Now...I can imagine a future where all of my devices collaborate...collaboration vs. convergence is probably the more correct vision of the future. This seems to be what Apple has in mind.



    Finally, none of us here on this board are stunting the growth of anything. I dare say that none of us have any real influence to direct the trajectory of some particular technology development. This is done in the labs of Apple, Microsoft, SONY, etc. and (mostly) by the pocketbooks of consumers.
  • Reply 22 of 72
    The DVD Player in Mac OS X was a bad idea then?



    There's absolutely no point in buying two different devices that do the same thing: display images on a screen.



    So you use your computer for work...who cares...most use their computers for games and other forms of entertainment such as chatting and web browsing. Using your computer for TV is not the ridiculous idea you make it out to be.



    There's no doubt in my mind that in 7-10 years, TV channels will be fed to the computer via the internet...this will slowly decrease the usefulness of TVs.



    People will be able to buy monitors and place 'em around the house and channels will be transferred wirelessly to them via the central computer. I'd say we're about 10 years away from that.



    And comparing a super low res CRT to a 30 inch Cinema Display was a pathetic attempt to make me believe 6 inches for 2000 dollars less would be the better deal.
  • Reply 23 of 72
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by kim kap sol

    There's absolutely no point in buying two different devices that do the same thing: display images on a screen.



    Unless you want to do two different things with them.



    ie, work, and off-in-the-corner-keeping-you-company-entertainment.



    Or do you think that multiple monitors are a bad thing, and instead we should just have *larger* monitors?



    Separating mental focus by separating tasks in physical visual fields is a highly efficient and effective approach.



    Heck, I use virtual desktops to partition my mental focus - one for iChat, social email, fun surfing, and another three for various stages of my workflow. Even the desktops are different. I'd *love* it if I had separate displays, but that's a little difficult with a laptop on the bus.
  • Reply 24 of 72
    Quote:

    Originally posted by Kickaha

    Unless you want to do two different things with them.



    ie, work, and off-in-the-corner-keeping-you-company-entertainment.



    Or do you think that multiple monitors are a bad thing, and instead we should just have *larger* monitors?



    Separating mental focus by separating tasks in physical visual fields is a highly efficient and effective approach.



    Heck, I use virtual desktops to partition my mental focus - one for iChat, social email, fun surfing, and another three for various stages of my workflow. Even the desktops are different. I'd *love* it if I had separate displays, but that's a little difficult with a laptop on the bus.




    I never said multiple monitors is a bad thing...in fact, it would be possible *today* to hang a 30" on the wall with the primary objective to view movies and TV and have a smaller monitor on a desk in the same room or the room right behind the 30 incher for computer purposes.



    The 30" would only be a second monitor...and with wireless bluetooth gizmos, the 30" could be used as a computer.



    Of course, it's possible right now but rough around the edges since two people can't use the one computer at the same time...but as computers get faster and as wireless becomes better, I don't see why one computer couldn't allow for multiple users simultaneously via multiple monitors around the house.



    The computer would handle all your iChatAV/phone needs, movie/TV needs. The computer would act as the computer we all know, a phone, a TV, whatever.



    As a general purpose tool, the computer is *MEANT* to merge existing technologies. Otherwise there would be no point to Skype, iChat AV, DVD Player, iTunes, etc.



    Someone could have invented an audio/video phone and made it mainstream, and someone would be wondering about the purpose of iChat AV..."why would I want to talk to and see someone while I'm working? I'd rather use a separate device." "Why iTunes when I can listen to my music through my sound system?"



    I was in fact contemplating the very idea of having a monitor in the next room. My computer is in my bedroom and the living room is right behind the wall my computer is up against. The monitor would have been used primarily for movie viewing, photo viewing, console emulation, but also general computer use. But I'm waiting a year or two so I can buy a G5 with bluetooth, PCIe, and a 30" Cinema Display when they're a little less expensive.



    There's no point buying an Apple LCD without buying the computer to go with it since you barely get any warranty on the screen when you buy it separately. And I wouldn't be able to set up the living room monitor as a computer without bluetooth. And I'd have to buy a video card that allows 2 monitors. So I might as well wait until my next computer purchase.



    But anyone could do this right now if they wanted to. It's up to Apple to make the setup more elegant to the end-user.



    Bringing TV to computers is not a stupid idea. It's just too bad most of you can't see beyond 'today'.



    edit: I actually have no idea what the iMac 3 will bring but having all the computer components behind the monitor and keeping the enclosure as thin as possible would bring everyone one step closer to what I've described.



    If Apple can market such a thing cheap enough, people will be able to buy these and hang them in various rooms of the house. Wireless networking and Rendezvous/OpenTalk will make all of them aware of each other...Xgrid in the future will combine their CPU power to act as one computer...each computer could act as a node to something: a phone line, the internet, TV cable, a powerful sound system, a single printer, and all computers would share these as though they were connected to them physically.

    Some of the computers could simply be used as photo frames (iPhoto slideshow); another could be set up in the kitchen as a TV and 'recipe book' that voices out instructions as you cook (iRecipes, anyone?). If Apple could come up with some good speech recognition software, people could simply voice out a command to a nearby computer to change the photo in the hallway, or the music that's playing, or search for a recipe.



    I think I'll revise the time to 5 years. In 5 years, this stuff will easily be possible. All it takes is some clever use of Xgrid, wireless networking, speech recognition, and speech synthesis.



    Apple can do it. But Steve has to realize that TV on computer isn't a stupid idea. Once that is done, the computers will be able to populate every room of the house including hallway walls.



    TV, iChat AV, DVD movies, music, etc... in every room of the house (this won't happen until computers are extremely cheap but I can see the beginning of such a set up.)
  • Reply 25 of 72
    airsluf Posts: 1,861 member
    Kickaha and Amorph couldn't moderate themselves out of a paper bag. Abdicate responsibility and succumb to idiocy. Two years of letting a member make personal attacks against others, then stepping aside when someone won't put up with it. Not only that, but go ahead and shut down my posting privileges but not the one making the attacks. Not even the common decency to abide by their warning (after three days of absorbing personal attacks with no mods in sight), just shut my posting down and then say it might happen later if a certain line is crossed. Bullshit flag is flying, I won't abide by lying and coddling of liars who go off-site, create accounts differing in a single letter from my handle with the express purpose to deceive, and then claim here that I did it. Everyone be warned, kim kap sol is a lying, deceitful poster.



    Now I guess they should have banned me rather than just shut off posting priviledges, because kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.



    Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
  • Reply 26 of 72
    hmurchison Posts: 12,464 member
    I don't care much about TVs and computers merging. I just want Apple to take advantage of opportunity. TiVo is getting hammered, not because they don't have a good product but because Wall Street's faith in them is lacking. Apple should carpe diem.



    "Opportunity is missed by most people because it is dressed in overalls and looks like work."



    Thomas Alva Edison.
  • Reply 27 of 72
    Quote:

    Originally posted by kim kap sol

    The DVD Player in Mac OS X was a bad idea then?



    No. But you seem to be suggesting that it must be THE way. Surely watching a DVD is a great use for my laptop while I'm on the road...but as my primary TV viewing device? Not so sure.



    Quote:

    There's absolutely no point in buying two different devices that do the same thing: display images on a screen.



    Already addressed.



    Quote:

    So you use your computer for work...who cares...most use their computers for games and other forms of entertainment such as chatting and web browsing.



    First, please back your assertions with some facts. Second, even if this is so...the things you speak of are interactive activities, whereas watching TV is typically more passive.



    Quote:

    Using your computer for TV is not the ridiculous idea you make it out to be.



    Didn't say it was ridiculous...but it doesn't appear to be what most people want to do.



    Quote:

    And comparing a super low res CRT to a 30 inch Cinema Display was a pathetic attempt to make me believe 6 inches for 2000 dollars less would be the better deal.



    Don't be ludicrous. YOU were suggesting that today's computer monitors can be used for TV viewing (as you more directly do in later posts). I don't consider this realistic. Second, the additional resolution isn't of much use for TV anyway. Finally, most people are not going to spend $3000 for a 30" display for watching TV when they can spend $1000 for a 36" TV or $2500-$3000 for a 42" plasma TV...JUST for TV viewing.
  • Reply 28 of 72
    amorph Posts: 7,112 member
    Quote:

    Originally posted by kim kap sol

    Apple can do it. But Steve has to realize that TV on computer isn't a stupid idea. Once that is done, the computers will be able to populate every room of the house including hallway walls.



    Steve answered this with a very simple thought problem: You've set your PC to record The West Wing at a certain time. Your sister decides to work in FCP, maxing out the PC's resources, and overlaps that time. What happens? Should Apple start putting in exceptions to the age-old rule that the interactive process commands the highest (userland) priority? Or do you let the PC be the PC, and let the TV be the TV, and keep them out of each others' way? As Steve put it, if he set a machine to record The West Wing then it had damn well better record it; but that means leaving the person actually using the PC interactively with a laggy and jittery user experience.



    The problem, in this case, is that Apple's software strategy involves squeezing out every last drop of performance from their hardware in order to make it as attractive as possible. So real-time, compute-intensive background processes that aren't related to the current interactive work are bad.



    That's not even touching on copyright issues, which are guaranteed to be 10x hairier with digital content than with the currently prevailing analog content. The content providers want their content to appear in a dedicated, locked-down box that is guaranteed to be the final destination for that content. They don't want someone releasing the Phantom Edit of their show, and they have spent millions to prevent that from happening. For anything like the future you envision to happen, the industry will have to turn 180 degrees around (in a direction where publishers have not been willing to go for the entire history of copyright law), and a few laws will have to be struck down or repealed. That, more than Steve or an unwillingness to have computers perform triage with user requests, will keep convergence from happening any time soon.
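    The thought problem above can be put in numbers. The following toy model (illustrative figures only, not Apple's actual scheduler) treats each task's demand as a fraction of one CPU: once the demands sum past 1.0, either the recording is guaranteed and the foreground gets laggy, or the foreground wins and the recording drops frames.

```python
# Toy single-CPU scheduling model. Demands are fractions of total CPU
# capacity; the figures are illustrative, not measurements of any real
# DVR or Final Cut Pro workload.
def schedule(demands, guaranteed=()):
    """Give guaranteed tasks their full demand first, then split what's
    left among the rest in proportion to their demands."""
    remaining = 1.0
    grants = {}
    for name in guaranteed:
        grants[name] = min(demands[name], remaining)
        remaining -= grants[name]
    others = {n: d for n, d in demands.items() if n not in guaranteed}
    total = sum(others.values())
    for name, d in others.items():
        grants[name] = d if total <= remaining else d * remaining / total
    return grants

demands = {"dvr_recording": 0.6, "fcp_render": 0.8}

# Policy 1: the recording "had damn well better record it".
print(schedule(demands, guaranteed=("dvr_recording",)))
# FCP gets only 0.4 of the 0.8 it wants -> laggy, jittery foreground.

# Policy 2: no guarantee; both share the CPU proportionally.
print(schedule(demands))
# Now the DVR gets only 0.6/1.4 of the CPU -> the recording drops frames.
```

    Either policy leaves somebody unhappy, which is exactly the argument for keeping the DVR in its own box.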
  • Reply 29 of 72
    kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Amorph

    Steve answered this with a very simple thought problem: You've set your PC to record The West Wing at a certain time. Your sister decides to work in FCP, maxing out the PC's resources, and overlaps that time. What happens? Should Apple start putting in exceptions to the age-old rule that the interactive process commands the highest (userland) priority? Or do you let the PC be the PC, and let the TV be the TV, and keep them out of each others' way? As Steve put it, if he set a machine to record The West Wing then it had damn well better record it; but that means leaving the person actually using the PC interactively with a laggy and jittery user experience.



    The problem, in this case, is that Apple's software strategy involves squeezing out every last drop of performance from their hardware in order to make it as attractive as possible. So real-time, compute-intensive background processes that aren't related to the current interactive work are bad.





    Well sure...but, and correct me if I'm wrong, doesn't this kind of problem already happen with Fast User Switching?
  • Reply 30 of 72
    buonrotto Posts: 6,368 member
    I think some people might be approaching this the wrong way. Let's stick to the idea of a Mac as a digital hub: not fusing the TV and Mac together in the same box, but using the Mac as a sort of tuner, jukebox or, as hmurchison originally pointed out, as a TiVo. Here is your best argument for a headless iMac: having one computer and multiple displays, one for viewing at your computer, one as a TV monitor (as opposed to tuner), and the CPU can be remote from both potentially (bandwidth issues apply!).

    Now, we might have to think about how a computer handles all this stuff: the memory, the bandwidth, the CPU and GPU performance and so forth. You might need a central DB type of hub and satellite processors for TV, computing, music, etc. It seems to me that any sort of convergence would mean that both ends of this proposal would have to move towards one another. Leaving the computer, our definition of it and its implementation, relatively immutable doesn't seem like a good frame of reference for something like this. Computers will have to be more like consumer electronics devices (stereos, TVs, etc.), and these electronics devices will have to be more like computers as people conventionally define them.
  • Reply 32 of 72
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by AirSluf

    1. Mach has real time threads which eliminates the recording performance issue.



    *Eliminates*???



    Um, no.



    If the DVR process wants 60% of the CPU, and FCP wants 80%... you're going to see FCP choke. Unless of course FCP is *also* RT (which, using QT, it is), in which case you're *really* screwed. Both processes choke.



    Since when has real-time ever magically increased your CPU capabilities??



    Quote:

    2. To counter background recording killing application performance you give the user the option to override it. The user stays in control of what their priorities are and chooses their desired front app performance impact consciously. The only issues then are on shared machines and that is a social problem, not a technological one--someone is going to win and that is usually the one with an administrators password.



    Explain that to Average Joe. Nuh-uh.



    Quote:

    3. That problem is already here even with the current TiVo and related other recorders. It won't get much worse with access on generic computers, as once the first digital pirate is out, distribution is ridiculously fast, and Pay Per View has gotten smart by showing on the same day the DVD is released, making it a technological wash as to where the originals could come from anyway.



    Actually, the problem is *not* here already because a TiVo is a dedicated device - you're not going to be editing FCP projects on it.
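    The point about real-time threads can be made concrete with a sketch of the admission test any real-time scheduler has to perform (the class and numbers here are hypothetical, not Mach's actual API): guarantees only hold while total utilization stays at or under 100%, so real-time threads decide who loses, they don't add CPU.

```python
# Hypothetical real-time admission test: a scheduler can only honor
# real-time guarantees while admitted utilization fits within the CPU's
# capacity, so a second hungry RT thread must be refused -- real time
# reshuffles the losers, it doesn't create cycles.
class RTScheduler:
    def __init__(self, capacity=1.0):
        self.capacity = capacity
        self.admitted = {}

    def admit(self, name, utilization):
        if sum(self.admitted.values()) + utilization > self.capacity:
            return False  # guarantee impossible: demand exceeds the CPU
        self.admitted[name] = utilization
        return True

rt = RTScheduler()
print(rt.admit("dvr", 0.6))     # True: 60% fits on its own
print(rt.admit("fcp_qt", 0.8))  # False: 60% + 80% > 100% -- one chokes
```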
  • Reply 33 of 72
    Kickaha, Kickaha, Kickaha. The problem already exists. It's not like DVR is bringing some new problem.



    We haven't moved towards a pre-emptive multitasking OS for nothing...it's brought a lot of pros but also some cons. We can now use multiple apps and they all get a share of the CPU (or 2 CPUs). The unfortunate side-effect is that some apps that get less priority will start to run slower than others when the CPU is maxed out.



    Explain what to Average Joe? That burning a DVD and encoding an MP3 at the same time will make Doom 3 run slower? Who cares. It's up to the user to realize that if there's some DVR recording going on, FCP is undoubtedly going to run slower. Compute-intensive tasks have been around forever. It's nothing new.
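    The trade-off described above can be sketched as simple priority-weighted time-slicing (the weights are made up, not Mac OS X's real scheduler): when the CPU is maxed out, every app still runs, the lower-priority ones just get a smaller slice.

```python
# Toy priority-weighted time-slicing in the spirit of a preemptive
# multitasking OS (weights are invented for illustration): with the CPU
# maxed out, each runnable task gets a slice proportional to its weight,
# so low-priority work slows down rather than stopping.
def cpu_shares(weights):
    total = sum(weights.values())
    return {task: w / total for task, w in weights.items()}

shares = cpu_shares({"doom3": 4, "dvd_burn": 2, "mp3_encode": 1})
for task, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{task}: {share:.0%} of the CPU")
```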
  • Reply 34 of 72
    kickaha Posts: 8,760 member
    Gee, you *don't say*... :P



    The assertion that real time threads 'eliminate' this problem is naive.



    And no, until now the possibility of such a situation really *hasn't* cropped up for the average Mac user. Name me one other situation that has already been widespread, if you would, where you have:



    1) a highly computationally intensive process running in the background owned by user A that demands real time threads (and before anyone screams that it's not computationally intensive - do you really think that with H.264 lying around that Apple would use a lesser quality codec if at all possible?)



    2) a highly computationally intensive process running in the foreground owned by user B that demands real time threads



    No rush, I'll wait.



    That's the situation you're looking at popping up with DVR capabilities on a Mac... I would *love* to have those capabilities, but I also am one of the minority who understand the ins and outs of the system, and potential consequences of running into such a scenario as above. User B, who may really really need that process *now* can't do squat about user A's process, unless B is an admin... and even then, user A is going to be a little peeved that their process was terminated and they don't get to watch their soap opera du jour. This is utterly not the same as the one user launching Doom while encoding an MP3 and burning a DVD - they control all the processes at that point, there are no surprises.



    The closest thing we've had come up so far is fast user switching, but even then you don't necessarily have the scheduling issue, where user A's process suddenly pops up and wants a large chunk of the CPU time, like, now.



    Bottom line: the mere existence of real time threads means bupkus in this situation. They don't magically make the problem go away. That's all.
  • Reply 35 of 72
    kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Kickaha

    Gee, you *don't say*... :P



    The assertion that real time threads 'eliminate' this problem is naive.



    And no, until now the possibility of such a situation really *hasn't* cropped up for the average Mac user. Name me one other situation that has already been widespread, if you would, where you have:



    1) a highly computationally intensive process running in the background owned by user A that demands real time threads (and before anyone screams that it's not computationally intensive - do you really think that with H.264 lying around that Apple would use a lesser quality codec if at all possible?)



    2) a highly computationally intensive process running in the foreground owned by user B that demands real time threads



    No rush, I'll wait.



    That's the situation you're looking at popping up with DVR capabilities on a Mac... I would *love* to have those capabilities, but I also am one of the minority who understand the ins and outs of the system, and potential consequences of running into such a scenario as above. User B, who may really really need that process *now* can't do squat about user A's process, unless B is an admin... and even then, user A is going to be a little peeved that their process was terminated and they don't get to watch their soap opera du jour.



    The closest thing we've had come up so far is fast user switching, but even then you don't necessarily have the scheduling issue, where user A's process suddenly pops up and wants a large chunk of the CPU time, like, now.



    Bottom line: the mere existence of real time threads means bupkus in this situation. That's all.




    What the hell? You're bringing a completely unrelated problem to the scenario...a problem that exists with dedicated devices anyway.



    If person A wants to use the VCR to record a show and person B says "Nuh uh, I want to watch something or record something else" then the same problem exists. If someone wants to watch TV and someone else wants to play Nintendo, one of the two is going to have to give. Get another VCR, TV or computer if conflicts like these occur.



    As computers move towards multicore or multiple CPUs, it will become much easier to have more than one person use a single computer at the same time. But problems like these have existed since ENIAC *and* exist with dedicated devices also.
  • Reply 36 of 72
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by kim kap sol

    What the hell? You're bringing a completely unrelated problem to the scenario...a problem that exists with dedicated devices anyway.



    Read back a little more closely. I was responding to AirSluf, who was responding to Amorph, who brought it up as a point that Steve Jobs made as to why DVR tasks can cause user issues. It's a very real problem.



    Quote:

    If person A wants to use the VCR to record a show and person B says "Nuh uh, I want to watch something or record something else" then the same problem exists. If someone wants to watch TV and someone else wants to play Nintendo, one of the two is going to have to give. Get another VCR, TV or computer if conflicts like these occur.



    BINGO. 'Get another device if conflicts like these occur.' It's called a DVR. That's your 'other device'. Then, no conflicts.



    You hit the nail on the head, even if you didn't mean to.



    I'd love to have DVR capabilities too. I just see that there are some very real resource conflicts that can occur and need to be taken into consideration. And no, real time threads don't make them magically go away. They can help, but they don't make them go away.
  • Reply 37 of 72
    kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Kickaha

    Read back a little more closely. I was responding to AirSluf, who was responding to Amorph, who brought it up as a point that Steve Jobs made as to why DVRs can cause user issues. It's a very real problem.







    BINGO. 'Get another device if conflicts like these occur.' It's called a DVR. That's your 'other device'. Then, no conflicts.



    You hit the nail on the head, even if you didn't mean to.




    So...ummm...does this mean we'll never get DVR on computers ever?



    I mean, just 8 years ago, encoding MP3s made everything else on my computer so slow I could barely do anything while it was happening. Heck!!! PLAYING an MP3 was something that would bog down the whole system.



    Really...I don't understand your lack of vision.



    DVR doesn't need to be something that's available to G3 or even G4 users. It could be something only installable on Dual G5 computers. I'm sure they can handle it with CPU cycles to spare for other apps...today! But saying it doesn't belong on computers and only dedicated devices should handle this kind of stuff is rather ridiculous.
  • Reply 38 of 72
    kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Kickaha



    I'd love to have DVR capabilities too. I just see that there are some very real resource conflicts that can occur and need to be taken into consideration. And no, real time threads don't make them magically go away. They can help, but they don't make them go away.




    Resource conflicts have *always* existed...don't pretend like they haven't. There is absolutely nothing new here.



    The point that you make about DVR bogging down FCP is moot.



    Don't do 4-way iChat video-conferencing while using FCP...it could get ugly. In fact, Steve decided to remove 4-way video-conferencing from iChat 3 because it could cause 'resource conflicts'.
  • Reply 39 of 72
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by kim kap sol

    Resource conflicts have *always* existed...don't pretend like they haven't. There is absolutely nothing new here.



    *sigh* Yes, and real time threads have *NOT* made them go away, have they?



    What part of this aren't you getting?



    Quote:

    Don't do 4-way iChat video-conferencing while using FCP...it could get ugly. In fact, Steve decided to remove 4-way video-conferencing from iChat 3 because it could cause 'resource conflicts'.



    Why do you think they limited iChat to 500MHz G3s? Quality of the video. Could slower machines do it? Sure. Was it 'acceptable Apple quality'? Nope. So they disabled it. The precedent is there... if it isn't 'up to snuff' at the user end, it simply won't be allowed.
  • Reply 40 of 72
    kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Kickaha

    Why do you think they limited iChat to 500MHz G3s? Quality of the video. Could slower machines do it? Sure. Was it 'acceptable Apple quality'? Nope. So they disabled it. The precedent is there... if it isn't 'up to snuff' at the user end, it simply won't be allowed.



    So you agree that DVR can potentially be added to top end computers? And that buying a dedicated DVR device is a waste of money if the computer *could* run a DVR app and FCP simultaneously and acceptably?



    That wasn't so bad, was it? TV viewing and TV recording have a place with computers. I'm glad you agree.