The following is my personal opinion regarding whether or not companies are able to claim “security by obscurity”. To level set all readers, this can be formally defined as reliance on the secrecy of the design or implementation as the main method of providing security for a system or a component of a system.
I do think it is possible for a company like Apple to claim “security by obscurity”. This includes implementing a nonstandard format, encoding, or protocol that outsiders would not be familiar with, making it harder for them to find a weakness or decipher sensitive information. Apple is historically known for attracting far fewer viruses than Microsoft and PCs. But as Apple became more popular, attacks began to accumulate as people realized that Apple has basic coding problems, just like other companies do. To move away from relying on this notion, Apple can use commercial or open source solutions to help thwart attackers. The benefit of these solutions is that they have been tested over time by an extensive community of users and researchers. Open source software allows the industry to inspect the code down to its most fundamental components and identify any weaknesses, which increases the chances that security flaws will be discovered and fixed early on.
I do not believe that any operating system in use today is “secured” solely by being “obscure”. That notion does nothing more than provide a false sense of security. It has been proven over and over again that very little effort is required to analyze and decode data that has merely been made unrecognizable. The problem with a false sense of security is that the implementation of such a control can sometimes cause more harm than good. Most of the time, the risk is not worth the small amount of money saved in the name of compliance.
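As a rough illustration of how little effort this can take, below is a minimal Python sketch of my own (a hypothetical example, not any vendor's actual scheme) in which data is "protected" only by XORing every byte with a secret key byte. An attacker who knows nothing about the key can recover the plaintext by simply trying all 256 possibilities and keeping the result that looks like readable text.

import string

# Hypothetical "obscure" encoding: XOR every byte with one secret key byte.
# This is obscurity, not real encryption.
SECRET_KEY = 0x5A
plaintext = b"account=alice;balance=1000"
obscured = bytes(b ^ SECRET_KEY for b in plaintext)

# Characters we expect to see in ordinary plaintext (an assumption for this sketch).
COMMON = set(string.ascii_lowercase + string.digits + " =;")

def recover(blob: bytes) -> tuple[int, bytes]:
    # Score each candidate key by how many decoded bytes fall in the expected
    # character set, then keep the best-scoring key.
    best = max(range(256), key=lambda k: sum(chr(b ^ k) in COMMON for b in blob))
    return best, bytes(b ^ best for b in blob)

key, recovered = recover(obscured)
print(f"guessed key: {key:#04x}, recovered: {recovered}")

Running this brute force takes a fraction of a second, which is the point: once the "secret" design is the only protection, recovering it is usually trivial compared to breaking a vetted, standard cryptographic algorithm.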