I was working on an article about the increasing use of Arm SoCs in the industry, and I had a few observations I wanted to understand better. I find it interesting that there is such huge variation in timelines, costs, and team sizes among these numerous startups. While I understand that the complexity of the use case will always be a big driver of cost, I was wondering whether there are some broad guidelines that explain it.
For example, what do SoCs developed in the $10-50M range have in common, compared with those developed in the $50-100M range? Is it just complexity that determines this, with AI processors and similar use cases needing more investment? Or is it the number of IC components being integrated together? If I want to think about this in terms of buckets, what would they be?
Thanks in advance for the help.