Tesla hires architect of Apple's Swift as VP of Autopilot Software


Comments

  • Reply 61 of 70
    nht Posts: 4,522 member
    asdasd said:
    Rayz2016 said:
    Actually, that's the most uneducated guess I've seen so far.

    Project Titan remains an unknown skunkworks project shrouded in rumour. Is it failing? We can't really say, because Apple hasn't actually said it's working on it, or what its end goals are.

    When Avie Tevanian left Apple, OS X was nowhere near finished; oddly enough, the OS is still being developed as far as I can tell. Tevanian wasn't working all by himself; neither was Lattner, and neither is Ive.

    There is an assumption that everyone at Apple is irreplaceable; that's not the case. The only person they can't seem to replace is Mansfield, and that is a worry, because he will want to retire at some point.

    No comparison, because Swift isn't finalised as a language yet. It is still basically in beta. It's like Bertrand Serlet or Tevanian leaving during the four-year beta of OS X.

    I mean, he started this internally off his own bat - was given that opportunity by Apple, and created a language. Generally, most people would see that as their life's work. But Musk clearly has his own reality distortion field.
    Swift 1.0 2014
    Swift 2.0 2015
    Swift 3.0 2016

    How is Swift 3.0 still in beta? Because Swift isn't an ISO, ANSI or ECMA standard? Then Python, Perl, Objective-C, Scala, etc. are still in beta.

    Avie left during Tiger (5 years post-launch). Chris is leaving during the equivalent of Panther (3 years post-launch).

    Neither left during "beta".  
  • Reply 62 of 70
    nht Posts: 4,522 member

    bkkcanuck said:
    1st said:
     Hopefully, his background in both hardware/software and compilers will benefit the self-driving car effort... complicated computation and data structures (sensors) need someone with an overall view of the control... Best of luck, Lattner - be a star, not dust.
    The issue is that, from what I can see, he has no background in Machine Learning / Artificial Intelligence at all (based on LinkedIn)... and Autopilot is all about Machine Learning / Artificial Intelligence. I see a big gaping hole in his work experience in terms of a transferable technical skillset... though what we are talking about is him going to Tesla to be a VP, and I don't know many VPs who spend their days doing much, if anything, technical. Yes, technical skills are important when you are managing technical people, but quite often technical gurus make really horrible managers, and the higher they go, the more obvious it becomes. I'm not saying that is the case here, but just because a person is good at compiler design and development does not mean he will be very good at artificial intelligence.
    Maybe they wanted someone who could lead the development of complex open-source AI infrastructure in a corporate environment.

    Musk created OpenAI and Universe. Part of Lattner's duties may be to guide open-source autopilot development while meeting corporate objectives - something he has proven he can do on two major open-source projects (LLVM and Swift), and he has also proven that he can attract and herd open-source contributors.
  • Reply 63 of 70
    1st Posts: 443 member
    bkkcanuck said:
    The issue is that, from what I can see, he has no background in Machine Learning / Artificial Intelligence at all (based on LinkedIn)... and Autopilot is all about Machine Learning / Artificial Intelligence. I see a big gaping hole in his work experience in terms of a transferable technical skillset... though what we are talking about is him going to Tesla to be a VP, and I don't know many VPs who spend their days doing much, if anything, technical. Yes, technical skills are important when you are managing technical people, but quite often technical gurus make really horrible managers, and the higher they go, the more obvious it becomes. I'm not saying that is the case here, but just because a person is good at compiler design and development does not mean he will be very good at artificial intelligence.

    (1) Machine learning and AI are not really that difficult - especially the new stuff coming out, like this: http://www.modha.org/papers/013.CICC2.pdf?cm_mc_uid=82028602362914842391353&cm_mc_sid_50200000=1484239135
    IBM is listing a high-school background as the minimum requirement for some positions. On the cutting edge or the bleeding edge, having no background is sometimes a blessing - it's easier to learn the new stuff.
    (2) VPs only recently - starting from the 90s - became non-technical (not all, but I agree with you: the majority, with an MBA under the belt). The few I have had the luck to work with/under were very technical - they knew the product inside out, could pull a budget/cost estimate in a second with a ROM of +/- 10%, and could schedule a launch (one of them worldwide) without delay, with risk factored in. Those were senior white-haired engineers who had worked many years in different departments, knowing not only the in-house data but also the competitors' (down to the daily plant output of other companies - even as an estimate, it is amazing). Those are the guys five years away from retirement... (not on the Facebook "young guys are just smarter" list). Not sure what kind of VP he will be, but at least he is not the MBA type ;-). As long as he follows in the footsteps of Bell Labs (The Idea Factory - the book) or Lockheed Martin (the Skunk Works), it will be fine ;-). Most likely the chaps with full confidence will do well; the bad managers are the ones who fit the Peter Principle - insecure internally... we will see. Good for Musk.


  • Reply 64 of 70
    gatorguy Posts: 24,213 member
    It's always about Ego.
    What? He was with Apple for over a decade, I'm sure it's about stretching his wings and working on his career -- if I were offered a VP spot I'd leave my current job too. 
    From another report:
    "The surprise decision of Swift creator and long-time Xcode lead Chris Lattner to leave Apple was in large part driven by his frustration with the culture of secrecy at the company, say developer friends."
  • Reply 65 of 70
    bkkcanuck
    1st said:
    (1) Machine learning and AI are not really that difficult - especially the new stuff coming out, like this: http://www.modha.org/papers/013.CICC2.pdf?cm_mc_uid=82028602362914842391353&cm_mc_sid_50200000=1484239135
    IBM is listing a high-school background as the minimum requirement for some positions. On the cutting edge or the bleeding edge, having no background is sometimes a blessing - it's easier to learn the new stuff.
    (2) VPs only recently - starting from the 90s - became non-technical (not all, but I agree with you: the majority, with an MBA under the belt). The few I have had the luck to work with/under were very technical - they knew the product inside out, could pull a budget/cost estimate in a second with a ROM of +/- 10%, and could schedule a launch (one of them worldwide) without delay, with risk factored in. Those were senior white-haired engineers who had worked many years in different departments, knowing not only the in-house data but also the competitors' (down to the daily plant output of other companies - even as an estimate, it is amazing). Those are the guys five years away from retirement... (not on the Facebook "young guys are just smarter" list). Not sure what kind of VP he will be, but at least he is not the MBA type ;-). As long as he follows in the footsteps of Bell Labs (The Idea Factory - the book) or Lockheed Martin (the Skunk Works), it will be fine ;-). Most likely the chaps with full confidence will do well; the bad managers are the ones who fit the Peter Principle - insecure internally... we will see. Good for Musk.
    We are not talking about using a few AI libraries, or being basically a glorified operator assisting a learning algorithm (assisted learning). We are talking about fundamental research and development, where you are involved in ground-breaking work that will eventually lead to advances that can be deployed in future products. If it were simply a matter of pumping out code, or of using libraries, we would all be in self-driving cars right now.
    I have taken some of the courses related to this - mostly introductory stuff - and I would not say that it is easy, mainly because it has been a long time since I was in school and I had to relearn some math that I have rarely used, plus some material that was entirely new. And all this just to learn the introductory material, not to the point where I would feel comfortable being thrown into AI research and development projects. If it were so easy, there would not be such a shortage in the field when it comes to hiring... any Joe out of high school, as you said, would be able to fill the position.
  • Reply 66 of 70
    1st Posts: 443 member
    bkkcanuck said:
    .. any joe out of high school as you said would be able to fill the position.
    I am not saying IBM is hiring the average Joe, high school or otherwise... (Bill and Steve were both high schoolers way back...).
  • Reply 67 of 70
    imispgh
    Lockheed engineer/whistleblower - driverless vehicles are not being designed or tested properly. The public should not be used as guinea pigs. There is a far safer and faster route.

    Additionally, Mr. Lattner is not remotely qualified to lead this effort. He has neither the domain experience nor the engineering best-practice experience.

    More on these here - https://www.linkedin.com/pulse/nhtsa-should-shut-down-all-auto-piloted-self-driving-cars-dekort?trk=mp-author-card

  • Reply 68 of 70
    gatorguy said:
    It's always about Ego.
    What? He was with Apple for over a decade, I'm sure it's about stretching his wings and working on his career -- if I were offered a VP spot I'd leave my current job too. 
    From another report:
    "The surprise decision of Swift creator and long-time Xcode lead Chris Lattner to leave Apple was in large part driven by his frustration with the culture of secrecy at the company, say developer friends."
    It should be noted that Lattner himself tweeted a reply to Rene Ritchie (of iMore) saying that those rumors/reports were attempts to make Apple look bad. So I take it there's little credibility to those "developer friends" stories.
  • Reply 69 of 70
    imispgh said:

    Lockheed engineer/whistleblower - driverless vehicles are not being designed or tested properly. The public should not be used as guinea pigs. There is a far safer and faster route.

    Additionally, Mr. Lattner is not remotely qualified to lead this effort. He has neither the domain experience nor the engineering best-practice experience.

    More on these here - https://www.linkedin.com/pulse/nhtsa-should-shut-down-all-auto-piloted-self-driving-cars-dekort?trk=mp-author-card


    I read through the link, and a minor issue is that he seems to be approaching this as a structured, conditional problem where you can diagram everything in simple diagrams (flow diagrams, use cases). From what little I know, the current "rage" in artificial intelligence is "deep learning", where you have a starting point containing formulas etc., then feed it lots and lots of information from which the "black box" learns. The problem, from a structured-programming point of view, is that for the most part we actually don't know in detail what it is thinking... and if you tried opening the black box and diagramming it, the result would probably not be followable by any human, since it is building a massive network based on statistics. It would not even be binary (0% or 100%); it would be all the values in between (i.e. not a simple if-then-else flow).
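That contrast between a diagrammable if-then-else flow and a learned, continuous-valued "black box" can be sketched in a few lines. This is a toy illustration with invented weights, not anything resembling Tesla's actual system:

```python
import math

def rule_based_brake(distance_m: float) -> bool:
    # Structured, diagrammable logic: one threshold, a yes/no branch.
    return distance_m < 20.0

def learned_brake_score(distance_m: float, closing_speed_ms: float) -> float:
    # Toy stand-in for a trained network: these weights are invented.
    # The point is that the output is a continuous confidence in (0, 1),
    # not a branch you could draw in a flow diagram.
    w_dist, w_speed, bias = -0.15, 0.25, 1.0
    z = w_dist * distance_m + w_speed * closing_speed_ms + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

print(rule_based_brake(15.0))                       # True: below the threshold
print(0.0 < learned_brake_score(15.0, 10.0) < 1.0)  # True: somewhere in between
```

A real network has millions of such weights learned from data rather than three hand-picked ones, which is why diagramming it the way the linked article expects is impractical.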

    In fact, if you did not allow the cars to drive on public roads, but only in controlled situations, you would effectively create a situation where the car could never learn from real-life events. The cars are not driving around fully autonomously right now; they have a driver in the car, helping the cars learn when they get into situations where they get confused.

    As far as not using the public as guinea pigs... the public has always been used as guinea pigs, at least in all the jurisdictions I have lived in (though Japan might be different). The process of getting a driver on the road was basically giving them a 60-page book with some basics to learn for the written test... basically, these are the rules (formulas)... (i.e. signs, what to do at a four-way intersection, etc.)... you write a test and then you get a certificate. From that point on, the public has always been a guinea pig. The learner driver gets in a car with an instructor who can take control if the student gets confused or makes a horrible mistake, but effectively we have an untrained driver on public roads, and everyone else on the road is already a guinea pig. This is no different from what is being done with "autonomous" cars right now.

    What is needed is not that the autonomous vehicle be completely diagrammed and every exception documented; what we need to be assured of, before we allow them free run of the roads, is that statistically, on average, they are safer than human drivers, and that when they do fail, the outcome is no worse than the failure of a human driver.
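That "statistically safer on average" bar amounts to a simple rate comparison. The numbers below are made up purely for illustration; a real analysis would also need confidence intervals over far more miles:

```python
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    # Normalise crash counts to a common exposure unit so the
    # two fleets can be compared fairly.
    return crashes / (miles / 1_000_000)

# Hypothetical figures, for illustration only.
human_rate = crashes_per_million_miles(crashes=4, miles=2_000_000)       # 2.0
autonomous_rate = crashes_per_million_miles(crashes=3, miles=3_000_000)  # 1.0

print(autonomous_rate < human_rate)  # True under these invented numbers
```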

    Once autonomous cars are on the road, there will of course be a transition period where we have basically the current infrastructure and both driven and driverless cars on the roads. Each year after that you will see more of those vehicles on the road, and the design of the roads and bridges will start to change to the needs of the autonomous cars, improving the situation for them. At a certain point, though, the biggest danger to those autonomous cars will be the cars with drivers... and it will eventually become illegal to actually drive your car on public roads.

    In fact, the quality of the average driver is beginning to drop dramatically as more distractions than ever take their toll. There have always been drivers making poor decisions and falling prey to distractions (before, it was eating, drinking and getting ready for work; now it is messaging, and even conversations where you actually have to think, affecting drivers' cognitive abilities). Most drivers fail to understand that they are effectively operating heavy machinery, and believe they can do more than one thing at a time - sometimes even something as simple as having a conversation with a passenger where the conversation is not "how is the weather" but something forcing the driver to think and multi-task... which we as humans are very poor at.


  • Reply 70 of 70
    bkkcanuck Posts: 864 member
    Earlier I mentioned that I had concerns that Chris Lattner, while great with LLVM and compiler tools, was not a great fit for AI/machine learning at Tesla...

    Looks like Tesla has fired Chris after six months because he was not a great fit. I would have preferred to be wrong... but this time (maybe one of the few), it looks like I was right...