Without arguing the merits of the Altera investment or divestment, a common pattern for Intel seems to be a wild see-sawing between an aggressive and a defensive market posture - it’s a regular occurrence for Intel to announce a bold new venture to try to claim some new territory, and just as regular that they announce they’re halting that venture in the name of “consolidating” and “focusing on their core.” The consequence is that they never give new ventures time to actually succeed, so they just bleed money creating things they murder in the cradle, and nobody born before last Tuesday bothers investing the time to learn the new Intel thing because its expected lifespan is shorter than the average Google product’s.
Intel either needs to focus or they need to be bold (and I’d actually prefer they be bold - they’ve started down some cool paths over time), but what they really need is to make up their goddamn minds and stop panicking every other quarter that their “ten-year bets” from last quarter haven’t paid off yet.
Speaking from personal experience, many director-level and above positions at Intel, especially in growth-related areas, are filled through nepotism and professional connections. I've never seen a headline about Intel’s decline and thought, 'Wow, how could that happen?'
I had a business partner with whom I agreed on a lot of things, but not about Intel. My assumption was that any small software package from Intel, such as a graph processing toolkit, was trash. He thought they could do no wrong.
Intel really is good at certain kinds of software, like compilers or MKL, but my belief is that organizations like that have a faith in their "number oneness" that gets in the way of doing anything outside what they're already good at. Maybe it is the people, processes, organization, values, etc. that get in the way. Or maybe it's not having the flexibility to recognize that what is good at task A is not good at task B.
I always saw Intel as a HW company making terribly bad SW. Anywhere I saw Intel SW, I would run away. Lately I used a big open source library from them, which is standard in the embedded space. It works great, but if you look at the code you will be puking for a week.
In my experience Intel's WiFi and Bluetooth drivers on Linux are, by far, the best. They're reliably available on the latest kernel and they actually work. After having used other brands on Linux, I have no intention of getting non-Intel WiFi or Bluetooth any time soon. The one time that I found a bug, emailing them about it got me in direct contact with the developers of the driver.
I had a different non-Intel WiFi card before where the driver literally permanently fried all occupied PCIe slots -- they never worked again and the problem happened right after installing the driver. I don't know how a driver such as this causes that but it looks like it did.
Yes, their open source drivers had a painful birth, but they are good once they've been sanded and sharpened by the community.
However, they somehow managed to bork the e1000e driver in a way that certain older cards sometimes fail to initialize and require a reboot. I was bitten by the bug, and the problem was fixed later by reverting the problematic patch in Debian.
I don't know the current state of the driver since I passed the system on. Besides a couple of bad patches in their VGA drivers, their cards are reliable and work well.
From my experience, their open source driver quality does not depend on the process, but on specific people and their knowledge and love for what they do.
I don't like the aggressive Intel which undercuts everyone with shady tactics, but I don't want them to wither and die either. It seems like their process, frequency and performance "tricks" are biting them now.
Interesting. Does Bluez fall under that umbrella?
I have found BlueZ by far the hardest stack to use for Bluetooth Low Energy peripherals. I have used iOS’s stack, suffered the evolution of the Android stack, used the ACI (ST’s layer), and finally just gone straight to the HCI from Python on a Pi. BlueZ is hands down my least favorite.
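For anyone curious what "straight Python to the HCI" can look like, here is a minimal sketch (my own illustration, not the parent's actual code): it opens a raw HCI socket on Linux, widens the kernel's event filter, and issues a harmless Read Local Version Information command. It assumes a Linux box such as a Raspberry Pi with adapter hci0 present and root privileges; the constants and packet layout come from the Bluetooth HCI specification.

    # Minimal sketch: raw HCI access from Python on Linux (assumes hci0 and root).
    import socket
    import struct

    # Open a raw HCI socket and bind it to adapter hci0 (device id 0).
    sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_RAW, socket.BTPROTO_HCI)
    sock.bind((0,))

    # Widen the kernel's HCI socket filter so event packets are delivered to us.
    # Layout mirrors struct hci_ufilter: u32 type_mask, u32 event_mask[2], u16 opcode.
    HCI_EVENT_PKT = 0x04
    flt = struct.pack("<IIIH", 1 << HCI_EVENT_PKT, 0xFFFFFFFF, 0xFFFFFFFF, 0)
    sock.setsockopt(socket.SOL_HCI, socket.HCI_FILTER, flt)

    # Send Read_Local_Version_Information (OGF 0x04, OCF 0x0001 -> opcode 0x1001):
    # packet type byte, little-endian opcode, parameter length of zero.
    HCI_COMMAND_PKT = 0x01
    opcode = (0x04 << 10) | 0x0001
    sock.send(struct.pack("<BHB", HCI_COMMAND_PKT, opcode, 0))

    # Block until the controller answers with a Command Complete event.
    event = sock.recv(260)
    print("HCI event:", event.hex())
    sock.close()

For anything beyond read-only commands you would typically stop bluetoothd first so the script and the daemon don't fight over the controller.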
That's only because their hardware is extremely simple,
so the driver has little to screw up. But they still manage to! For example, the PCI cards are all broken, when it's literally the same hardware as the USB ones.
The team working on their Realsense depth cameras was doing great work on the SDK, in my opinion.
Frequent releases, GitHub repo with good enough user interaction, examples, bug fixing and feedback.
> such as a graph processing toolkit
This is oddly specific. Can you share the exact Intel software toolkit?
> "number oneness"
Why does this not affect NVidia, Amazon, Apple, or TSMC?
The affliction he’s imputing is born of absolute dominance over decades. Apple has never had the same level of dominance, and NVidia has only had it for two or three years.
It could possibly come to haunt NVidia or TSMC in decades to come.
A friend who developed a game engine from scratch and is familiar with inner workings and behavior of NVIDIA driver calls it an absolute circus of a driver.
Also, their latest consumer card launches are less than stellar, and the tricks they use to pump up performance numbers are borderline fraud.
As Gamers Nexus puts it "Fake prices for fake frames".
My response is somewhat tangential: When I look at GPUs strictly from the perspective of gaming performance, the last few generations have been so underwhelming. I am not a gamer, but games basically look life-like at this point. What kind of improvements are gamers expecting going forward? Seriously, a mid-level GPU has life-like raytracing at 4K/60HZ. What else do you need for gaming? (Please don't read this as looking down upon gaming; I am only questioning what else gamers need from their GPUs.)
To me, the situation is similar with monitors. After we got the pixel density of 4K at 27 inches with 60Hz refresh rate (enough pixels, enough inches, enough refresh rate), how can it get any better for normies? Ok, maybe we can add HDR, but monitors are mostly finished, similar to mobile phones. Ah, one last one: I guess we can upgrade to OLED when the prices are not so scandalous. Still, the corporate normies, who account for the lion's share of people sitting in front of 1990s-style desktop PCs with a monitor, are fine with 4K at 27 inches with 60Hz refresh rate forever.
I can't answer the first part, since I'm not playing any modern games, but I continually revisit RTS games like the C&C and Starcraft series.
However, I can talk about monitors. Yes, a 27" 4K@60 monitor is really, really good, but panel quality (lighting, uniformity and color correctness) goes a long way. After using Dell and HP "business" monitors for so long, most "normal monitors for normies" look bad to me: uncomfortable, with harsh light and bad uniformity.
So, monitor quality is not "finished" yet. I don't like OLEDs on big screens, because I tend to use what I buy for a very long time, and I don't want my screen to age non-uniformly, especially if I'm looking at it every day and for long periods of time.
Is OLED burn-in still a thing? If yes, then you are probably right: normies will not upgrade to OLED until that issue is fixed, or a new technology replaces it.
See the funny thing is, even with all of this stuff about Intel that I hear about (and agree with as reported), I also just committed a cardinal sin just recently.
I'm old, i.e. "never buy ATI" is something that I've stuck to since the very early Nvidia days: I switched from Matrox and Voodoo to Nvidia while witnessing and commiserating over friends' and colleagues' ATI woes for years.
The high-end gaming days are long gone; there was even a stretch of laptops where 3D graphics was of no concern whatsoever. Those happened to have Intel chips and integrated graphics. I could even start up some games I missed out on over the years, or replay old favourites, just fine, as even a business laptop's Intel integrated graphics chip was enough for that.
And then I bought an AMD-based laptop with integrated Radeon graphics because of all that negative stuff you hear about Intel, and because AMD itself is fine, sometimes even better, so I thought it was fair to give it a try.
Oh my was that a mistake. AMD Radeon graphics is still the old ATI in full blown problem glory. I guess it's going to be another 25 years until I might make that mistake again.
It's a bummer you've had poor experiences with ATI and later AMD, especially on a new system. I have an AMD laptop with Ryzen 7 7840U which includes a Radeon 780M for integrated graphics and it's been rock solid. I tested many old and new titles on it, albeit at medium-ish settings.
What kind of problems did you see on your laptop?
Built a PC with a top-of-the line AMD CPU, it's great. AMD APUs are great in dedicated gaming devices like the XBOX ONE, PS 4 and 5 and Steam Deck.
On the other hand, I still think of Intel integrated GPUs as "that thing that screws up your web browser chrome if you have a laptop with dedicated graphics".
Not tharkun__:
AMD basically stopped supporting (including updating drivers for) GPUs before RDNA (in particular GCN), while such GPUs were still part of AMD's Zen 3 APU offerings.
Well, back when, literally 25 years ago, when it was all ATI, there were constant driver issues with ATI. I think it's a pretty well known thing. At least it was back then.
I did think that given ATI was bought out by AMD and AMD itself is fine it should be OK. AMD always was. I've had systems with AMD CPUs and Nvidia GPUs back when it was an actual desktop tower gaming system I was building/upgrading myself. Heck my basement server is still an AMD CPU system with zero issues whatsoever. Of course it's got zero graphics duties.
On the laptop side, for a time I'd buy something with discrete Nvidia cards when I was still gaming more actively. But then life happened, so graphics was no longer important and I do keep my systems for a long time / buy non-latest gen. So by chance I've been with Intel for a long time and gaming came up again, casually. The Intel HD graphics were of course totally inadequate for any "real" current gaming. But I found that replaying some old favs and even "newer" games I had missed out on (new as in, playing a 2013 game for the very first time in 2023 type thing) was totally fine on an Intel iGPU.
So when I was getting to newer titles and the Intel HD graphics no longer cut it, but I'm still not a "gamer" again, I looked at a more recent system and thought I'd be totally fine trying an AMD system. Exactly like another poster said: "post 2015 should be fine, right?! And then there's all this recent bad news about Intel, this is the time to switch!"
Still iGPU. I'm not going to shell out thousands of dollars here.
And then I get the system and I get into Windows and ... everything just looks way too bright, washed out, hard to look at. I doctored around, installed the latest AMD Adrenalin driver, played around with brightness, contrast, HDR, color balance, tried to disable the Vari-Brightness I read was supposed to be the culprit, etc. It does get worse once you get into a game. Like you're in Windows and it's bearable. Then you start a game and you might Alt-Tab back to do something and everything is just awfully, weirdly bright, and it doesn't go away when you shut down the game either.
I stuck with it and kept doctoring for over 6 months now.
I've had enough. I bought a new laptop, two generations behind, with an Intel Iris Xe, for the same amount of money as the ATI system. I open Windows and ... everything is entirely, totally, 150% fine, no need to adjust anything. It's comfortable, colors are fine, brightness and contrast are fine. And the performance is entirely adequate, about the same as with the AMD system. Again, still iGPU, and that's fine and expected. It's the quality I'm concerned with, not the performance I'm paying for. I expect to be able to get proper quality software and hardware even if I pay for less performance than gamer-kid me was willing to back in the day.
> And then I get the system and I get into Windows and ... everything just looks way too bright, washed out, hard to look at.
I've seen OEMs do that to an Intel+NVIDIA laptop, too. Whatever you imagine AMD's software incompetence to be, PC OEMs are worse.
It's Lenovo. FWIW, one thing I really didn't like much either was that I found out that AMD really tries to hide what actual GPU is in there.
Everything just reports it as "with Radeon graphics", including benchmarking software, so it's almost impossible to find anything about it online.
The only thing I found helped was GPU-Z. Maybe it's just one of the known bad ones and everything else is fine and "I bought the one lemon from a prime steak company" but that doesn't change that my first experience with the lemon company turned prime steak company is ... another lemon ;)
It's a Lucienne C2 apparently. And again, performance wise, absolute exactly as I expected. Graphics quality and AMD software? Unfortunately exactly what I expected from ATI :(
And I'm not alone: when I look online, what I find is not just Lenovo either, so I doubt it's that. All, and I mean all, of the laptops I'm talking about here were Lenovos, including when they were called IBM ThinkPads and just built by Lenovo ;)
Laptops have really gone to hell in the past few years. IMO the only sane laptop choices remaining are Framework and Apple. Every other vendor is mess, especially when it comes to properly sleeping when closing the lid.
I bought an AMD Ryzen Thinkpad late last year, and I had the same issue with bright/saturated colours. I fixed it by running X-Rite Color Assistant which was bundled with the laptop, and setting the profile to sRGB. I then turned up the brightness a little.
I think this is a consequence of the laptop having HDR colour, and the vendor wanting to make it obvious. It's the blinding blue LED of the current day.
Yeah, I read HDR might be the issue. I didn't know about X-Rite, and it did not come with the laptop, but I did play with disabling / trying to adjust HDR, making sure sRGB was set, etc. It did not help. I also ran all the calibrations I could find for gamma, brightness and contrast many, many times to try and find something that was better.
What I settled on for quite some time was manually adjusted color balance and contrast and turning the brightness down. That made it bearable but especially right next to another system, it's just "off" and still washed out.
If this was HDR and one can't get rid of it, then yeah agreed, it's just bad. I'm actually surprised you'd turn the brightness up. That was one of the worst things to do, to have the brightness too high. Felt like it was burning my eyes.
So long story short...
You don't like current AMD systems because one of them had an HDR screen? Nothing to do with CPU/GPU/APU?
If the diagnosis is that AMD GPUs can't do HDR properly then yes. There was not a single setting anywhere in Windows itself nor the Adrenalin driver software that allowed me to configure the screen to a comfortable setting. Even when specifically trying to disable anything HDR related.
My work Macbook on the other hand has zero issues with HDR and its display.
To be fair, you can still blame the OEM of course but as a user I have no way to distinguish that, especially in my specific situation.
I think I found X-Rite by just searching for color with the start menu.
Before I used that tool, I tried a few of the built-in colour profiles under the display settings, and that didn't help.
I had to turn the brightness up because when the display is in sRGB it gets dimmer. Everything is much more dim and muted, like a conventional laptop screen. But if I change it back to say, one of the DICOM profiles, then yeah, torch mode. (And if I turn the brightness down in that mode, bright colours are fine but dim colours are too dim and everything is still too saturated).
Did you time travel from 2015 or something? Haven't heard of anyone having AMD issues in a very long time...
I’ve been consistently impressed with AMD for a while now. They’re constantly undervalued for no reason other than CUDA from what I can tell.
AMD is appropriately valued IMO, Intel is undervalued and Nvidia is wildly overvalued. We're hitting a wall with LLMs, Nvidia was at one point valued higher than Apple which is insane.
Also CUDA doesn't matter that much, Nvidia was powered by intense AGI FOMO but I think that frenzy is more or less done.
What?!
Nvidia is valuable precisely because of the software, which is also why AMD is not so valuable. CUDA matters a lot (though that might become less true soon). And Nvidia's CUDA/software forward thinking most certainly predated AGI FOMO, and it is the CAUSE of them doing so well with this "AI boom".
It's also not wildly overvalued, purely on a forward PE basis.*
I do wonder about the LLM focus, specifically whether we're designing hardware too much for LLM at the cost of other ML/scientific computing workflows, especially the focus on low precision ops.
But.. 1) I don't know how a company like Nvidia could feasibly not focus on designing for LLM in the midst of this craziness and not be sued by shareholders for negligence or something 2) they're able to roll out new architectures with great improvements, especially in memory, on a 2 year cycle! I obviously don't know the counterfactual, but I think without the LLM craze, the hypothetical generation of GPU/compute chips would be behind where they are now.
I think it's possible AMD is undervalued. I've been hoping forever they'd somehow catch up on software. They do very well in the server business, and if Intel continues fucking up as much as they have been, AMD will own CPUs/servers. I also think what DeepSeek has done may convince people it's worth programming closer to the hardware, somewhat weakening Nvidia's software moat.
*Of course, it's possible I'm not discounting enough for the geopolitical risk.
> It's also not wildly overvalued, purely on a forward PE basis.*
Once you start approaching a critical mass of sales, it's very difficult to keep growing it. Nvidia is being valued as though they'll reach a trillion dollars worth of sales per year. So nearly 10x growth.
You need to make a lot of assumptions to explain how they'll reach that, versus a ton of risk.
Risk #1: arbitrage principle aka. wherever there's profit to be made other players will move in. AMD has AI chips that are doing quite well, Amazon and Google both have their own AI chips, Apple has their own AI chips... IMO it's far more likely that we'll see commodification of AI chips than that the whole industry will do nothing and pay Nvidia's markup. Especially since TSMC is the one making the chips, not Nvidia.
Risk #2: AI is hitting a wall. VCs claim it isn't so, but it's pretty obvious that it is. We went from "AGI in 2025" to AI companies essentially adding traditional AI elements to LLMs to make them useful. LLMs will never reach AGI; we need another technological breakthrough. Companies won't be willing to keep buying every generation of Nvidia chip for ever-diminishing returns.
Risk #3: Geopolitical, as you mentioned. Tariffs, China, etc...
Risk #4: CUDA isn't a moat. It was when no one else had the incentive to create an alternative and it gave everyone on Nvidia a head start. But now everything runs on AMD too. Google and Amazon have obviously figured out something for their own accelerators.
The only way Nvidia reaches enough revenue to justify their market cap is if Jensen Huang's wild futuristic predictions become reality AND the Googles, Amazons, Apples, AMDs, Qualcomms, Mediateks and every other chip company all fail to catch up.
What I see right now is AI hitting a wall and the commodification of chip production.
Not really. I don't want to just re-paste everything, but basically this: https://news.ycombinator.com/item?id=43688088 where I also sort of address your 2015 mention here.
Ah, Windows OEM nonsense...
I've used Linux exclusively for 15 years so probably why my experience is so positive. Both Intel and AMD are pretty much flawless on Linux, drivers for both are in the kernel nowadays, AMD just wins slightly with their iGPUs.
Yet my AMD APU was never properly supported for hardware video decoding, and could only do up to OpenGL 3.3, while the Windows 10 driver could go up to OpenGL 4.1.
Weird. Was it pre-Zen?
I had a Ryzen 2700u that was fully supported, latest OpenGL and Vulkan from day 1, hardware decoding, etc... but on Linux.
Meanwhile PC gamers have no trouble using their AMD GPUs to play Windows games on Linux.
That's actually something I have not tried at all again yet.
Back in the day, w/ AMD CPU and Nvidia GPU, I was gaming on Linux a lot. ATI was basically unusable on Linux while Nvidia (not with the nouveau driver of course), if you looked past the whole kernel driver controversy with GPL hardliners, was excellent quality and performance. It just worked and it performed.
I was playing World of Warcraft back in the mid 2000s via Wine on Linux and the experience was actually better than in Windows. And other titles like say Counter Strike 1.5, 1.6 and Q3 of course.
I have not tried that in a long time. I did hear exactly what you're saying here. Then again I heard the same about AMD buying ATI and things being OK now. My other reply(ies) elaborate on what exactly the experience has been if you're interested.
Can’t say what your experience with your particular box will be, but the steam deck is absolutely fantastic.
I wish I had an AMD card. Instead our work laptops are X1 extremes with discrete nvidia cards and they are absolutely infuriating. The external outputs are all routed through the nvidia card, so one frequently ends up with the fan blowing on full blast when plugged into a monitor. Moreover, when unplugging the laptop often fails to shutdown the discrete graphics card so suddenly the battery is empty (because the discrete card uses twice the power). The Intel card on the other hand seems to prevent S3 sleep when on battery, i.e. the laptop starts sleeping and immediately wakes up again (I chased it down to the Intel driver but couldn't get further).
And I'm not even talking about the hassle of the nvidia drivers on Linux (which admittedly has become quite a bit better).
All that just for some negligible graphics power that I'm never using on the laptop.
That’s not specific to Intel though. That’s how Directors and above are recruited in any big company.
For example, Uber hired a VP from Amazon. And the first thing he did was to hire most of his immediate reports at Amazon to Director/Senior Director positions at Uber.
At that level of management work gets done mostly through connections, favors and networking.
I tell people that if they get a new boss who is at Director or above, assume that you are re-interviewing for your job for the first 6 months with the new boss.
Major companies like that become infected with large hierarchies of scum sucking middle management that eat revenue with bonuses.
Of course they are obsessed with shrinking labor costs and resisting all downsizing until it reaches comical levels.
Take a company in an industry like health insurance, which can't show a large dividend because it would be a public relations disaster. Filled to the gills with vice presidents to suck up extra earnings. Or medical devices.
Software is also very difficult for these hierarchies of overpaid management, because you need to pay labor well to get good software, and the only raison d'etre of these guys is wage suppression.
Leadership is hard for these managers because the primary thing rewarded is middle management machiavellianism, turf wars, and domain building, and any visionary leadership or inspiration is quashed.
It almost fascinates me that large company organizations basically are like Soviet-style communism, even though there are opportunities for internal competition, like data centers, hosting and IT groups. They always need to be centralized for "efficiency".
Meanwhile, they have like 20 data centers, and if you had each of them compete for the company's internal business, they'd all run more efficiently.
> It almost fascinates me that large company organizations basically are like Soviet-style communism, even though there are opportunities for internal competition.
Probably because continuous competition is inefficient within an organization and can cause division/animosity between teams?
"within an organization and can cause division/animosity between teams"
Are you aware of what goes on in middle management? This is the normal state of affairs between managers.
If what you are saying is true, then .......
Why is there competition in the open marketplace? You have just validated my suggestion that internally companies operate like communists.
> Why is there competition in the open marketplace? You have just validated my suggestion that internally companies operate like communists.
I am not an expert, but I think the theory of competition leading to better outcomes in a marketplace rests on the availability of alternatives if one company goes bad (in addition to price competition, etc.).
Inside a company you are working for the same goal "against" the outside, so it's probably more an artifact of how our economy is oriented.
i'd guess if our economy was oriented around cooperation instead of "competition" (while keeping alternatives around) that dichotomy might go away...
just some random thoughts from an internet person
> it’s a regular occurrence for Intel to announce a bold new venture to try to claim some new territory, and just as regular that they announce they’re halting that venture in the name of “consolidating” and “focusing on their core.” [...] [Intel's new thing's] expected lifespan is shorter than the average Google product.
You got there in the end. You get the same outcome with the same corporate incentive.
Both Intel and Google prioritize {starting something new} over {growing an existing thing}, in terms of corporate promotions and rewards, and therefore employees and leaders self-optimize to produce the repeated behavior you see.
The way to fix this would be to decrease the rewards for starting a new thing and increase the rewards for evolving and growing an existing line of business.
> Both Intel and Google prioritize {starting something new} over {growing an existing thing}, in terms of corporate promotions and rewards, and therefore employees and leaders self-optimize to produce the repeated behavior you see.
I cannot speak for Intel, but Google has done very well by "growing an existing thing" in AdWords and YouTube. Both account for the lion's share of profits. They are absolute revenue giants. Many have tried, and failed, to chip away at that lead, but Google has managed to adapt over and over again.
Those are the only two things that Google has regularly maintained, one of which has one of the biggest moats (YouTube, the go-to video service), and the other is connected to the homepage of the internet.
It's really hard to fuck these things up. Which they have been trying hard to do, given the state of YouTube and the search engine.
rewards for not fucking up an existing (monopoly) line of business
I can see why you have to be "special" to work at these places.
It's similar to sales vs dev in software. Sales are always prioritizing new features to attract new users instead of fixing the known issues that are pissing off your current users.
New features attract new users and allow for fancy press releases. Nobody cares about a press release about an existing product getting a bug fix or becoming more stable.
Our society is nothing but "ooh look, shiny!" type of short attention span
But, well, it was a ten-year bet: Altera was acquired in 2015.
If they could not figure how to make it profitable, maybe somebody else should try. (Of course I don't think that the PE company is going to do just that.)
It was a ten-year bet, but they spent the first several years actively sabotaging Altera by trying to move their whole product stack over to non-functional Intel fabs.
...and the majority of their internal development systems they used for all their chip design and layout.
Doesn't purchase by a PE company pretty much guarantee the death of it? At least the selling off of the most profitable parts and pieces? Has there ever been a story of a PE purchase and the company grew under the new owner?
PE firms buy companies to increase the company’s value, then sell it. There have been many successes: PowerSchool, Hilton, Dunkin’ Brands, Dollar General, Beats by Dre, Petco, GoDaddy, BJ’s Wholesale Club, Neiman Marcus, Panera Bread, Allegro, Guitar Center, Nielsen, McAfee…
Most of these have very serious issues, especially with regards to labor violations and general treatment of employees.
> McAfee
I wouldn't call that a roaring success. Funnily enough, Intel played a major role in running McAfee into the ground.
With proper leadership, McAfee could've ended up in the position CrowdStrike is now.
Trying not to piss off the Chinese government, and in particular its intelligence services (in order to sell chips) is unfortunately not a good model for an antimalware business.
Silver Lake took Dell private.
M&A churn is a way for management to monetize their power. Efficacy is a distant second concern.
How does management benefit from M&As? Sorry if this is a basic question. Do executives get paid based on the number of acquisitions?
Two ways:
1. Bonuses by juicing revenue numbers.
2. A bigger next job, by doing M&A and having a really good-looking resume and interview story.
See item #6 here
https://corporatefinanceinstitute.com/resources/valuation/mo...
> Additionally, managers may prefer mergers because empirical evidence suggests that the size of a company and the compensation of managers are correlated.
Yeah, that's where my mind went. Executive and upper management salaries seem to be a function of revenue, not profit.
A lot of compensation works that way, to be honest. First order is that you get a percentage of whatever river of money you sit close to, regardless of effort or skill.
Especially when you are talking about software. Revenue means you have a customer that is locked up. Once you are ready to take profit, reduce costs / jack up prices and the profit comes rolling in.
If I hold stock in a company, then my company acquires that company, the stock rises, and I liquidate my position in it after 6 months or whatever the cool-down period is, is this considered insider trading?
If you hold stock in company A, and your current company B acquires company A, that's not insider trading if you already owned the stock in company A before you had any information that company B was going to make that decision.
It is, however, a conflict of interest for you to be involved in company B's acquisition of company A (e.g. influencing company B to buy company A), and might even rise to the level of a breach of your fiduciary duty to company B.
I know a woman who was part of a M&A team. On her first day, she was told her days of owning individual stocks in the industry were over. She could only purchase aggregate funds. Although, I do wonder if the same rules apply to the VPs who actually have to sign off on the deals.
Insider trading is all about information held by "insiders", not about who owns what. So it would depend on whether you know something material and nonpublic when you liquidate your position (e.g., you know the acquisition is going terribly and the acquiring company is going to write it off).
This seems to be common for corporate America in general. I used to work at a YC startup. We kiiiiiinda maaaaaaaybe ran out of money (not my department) and happened to get bought by a large investor that also happens to be a US-based hardware manufacturer. Two years and countless reorgs later, they laid everyone off and as far as I know, are no longer in the business of selling the software products they bought. They never figured out how software worked, never had anyone managing the division for more than 6 months, and got bored. I think they thought by moving everyone over to Microsoft Word and Windows laptops (peppered with a half-hearted threat about RTO), they would just magically make billions of dollars the first month. It didn't happen.
I am beginning to think M&A are just some sort of ego thing for bored megacorp execs, rather than serious attempts to add efficiency and value to the marketplace. (Prove me wrong, bored megacorp execs. I'll wait.)
Having been through a few acquisitions myself, I think there is a perverse incentive where buying and destroying any competition (real or imagined) leads to positive enough outcomes that it doesn't matter if the underlying asset is destroyed. Nobody would come out and say that, but when an acquisition is tossed aside there may not be enough repercussions to prevent it from happening again.
Intel bought a drone company that was producing the only drone that was good enough for my real estate inspection company to use. They acquired it and then killed it a year or two after. The inspection industry didn't have a proper drone for years after that until DJI started getting serious about it and produced the M30E.
It was just senseless, Intel doesn't have real or imagined competition from a drone company, it wasn't even close to being in the same market. They just believed the hype about drones being the next big thing and when they found out they were too early they decided they didn't have the patience to wait for drones to become a thing and they killed it. There was no long term vision behind it.
The high-end Falcon models were an engineering marvel and, as you say, nothing else in the market was even close.
I don't know about "real estate inspection", but another use case was for them to be used in oil rigs in the North Sea to inspect the structure of the rig itself. They had to be self-stabilizing under high winds and adverse weather conditions, and they had to carry a good enough camera to take detailed photos.
Unfortunately, while the technology was there, the market wasn't. Not many wanted to get a $35K drone to be able to sustain this business.
Exactly, but surely they weren't actually losing money? Why not keep the business afloat selling expensive drones to specialty companies until the broader market picked up, as they envisioned it would when they bought the company? I think we paid more like $25k for our Falcons, though buying them wasn't on my side of the company. We would have gladly paid $35k for a next-gen Falcon if they had ever made one. Now DJI makes good-enough drones for less than $10k, but there's a chance we still would've gone with Intel just so we could tell our customers we weren't flying Chinese drones.
In real estate inspection, we had the same sort of concerns, can't fly too close to the object for safety reasons, and we need high resolution photos to determine quality of the masonry, paintwork and roofing etc..
Wow, this post is really specific. What special hardware is required on a drone for "real estate inspection"?
The company is still thriving, you can check out their website to find out more about what real estate inspection is about (in The Netherlands): https://www.aeroscan.nl/
I believe it could be the weight of the camera and lens you would desire for good looking photos (think Sony a7 size). Good looking photos sell houses.
EDIT I just noticed the “inspection” part. Maybe they wanted good zoom to spy on the tenants? (Or maybe that’s a really uncharitable take).
To me, high quality photos for real estate inspection means (e.g.) being able to take high resolution photos of a specific part of the roof so you can understand why there’s a leak. Not having to climb is a big deal!
This is one of the main reasons we added anti monopoly provisions to our laws more than 100 years ago. Market dominance is a recognized factor in allowing this inversion of rewards to occur.
That's the face of it. Labor is a market as well. The impact of these arrangements on our labor pool is extraordinary. It's a massive displaced cost of allowing these types of mergers to occur, borne by the people who stand to gain the least from the merging of business assets.
> I am beginning to think M&A are just some sort of ego thing for bored megacorp execs
It seems like a low risk effort to put a promising inexperienced exec in charge of a recent acquisition.
If they're a screw up and run it into the ground, imagine how much damage they could have done in a megacorp position of power.
Megacorp saved (at the cost of a smaller company)
Is Intel still a mega corporation? That seems to be the real problem for Intel. Becoming prey.
And Intel's acquisitions kill off promising startups. At least Altera is being sort of spun off instead of outright destroyed.
My personal theory is that desktop / laptop / server x86 (usually) is such a giant money printer that a) Intel can invest in anything (Altera, antivirus, Optane...) but b) when they do, they quickly realise that this isn't a giant profit margin machine like x86, so why bother?
They fuck their customers when they do that. A good friend of mine had a product designed around Quark that was about to go into production when Intel pulled the rug out from under him.
I worked for a former Fortune 300 company that had an active internal investment strategy. They wanted the next billion dollar business, guaranteed, in 12 months. And they didn't want to invest more than 1 million dollars. Sadly they are now bankrupt and owned by PE.
It could just be a stock play.. Need the stock to move up? Buy a company.
Stock down again? Sell the company you bought 2 years ago.
From the top to the bottom the problem with late stage capitalism is misaligned incentives.
Edit: I wrote "the problem" and I should have written "among the many, many problems"
Seems they should read Andy Grove’s books.
> a wild see-sawing between an aggressive and a defensive market posture
tick, tock