02-16-2025, 04:35 AM
[article] Linux Phone Takeover
02-16-2025, 03:35 PM
(02-16-2025, 04:35 AM)Juergen Wrote: https://www.hackers-game.com/2025/01/26/...-our-apps/ A short description might be a good idea. The article basically says that AI (artificial intelligence) is used to code programs for users, and because Linux usually has compiling tools available on the user's system, the AI can create the app directly on the device; no cloud is needed. Supposedly this would revolutionize the user experience in contrast to Android and iOS, where apps need to be approved. I'm not a giant fan of AI: it generates a huge load of garbage, wastes electricity, and is overmarketed by certain companies. Although, if this is done locally on the user's machine, I might tolerate the idea. I still have serious reservations. One issue in particular is important: what might be a serious holdout against alternative operating systems, and even against basic technical phones, are the crucial apps that depend on closed platforms like Google Play Services, such as banking apps. Some users need a smartphone with closed components for these crucial apps. Even an un-Googled LineageOS is not an option for these.
I think expecting "AI" to be a solution here is itself a bit of an AI "hallucination". Even if I thought AI were up to the task and could be trusted, having separate, unique apps for each use case is problematic. It's like the popular driving mentality where every driver must always be in front, in the fast lane, adversely affecting everyone else.
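To be clear about what is actually being claimed, the workflow would boil down to something like the sketch below. This is purely illustrative: "local-model" is a hypothetical on-device code generator that does not exist, and the only realistic part is that a C toolchain is usually already installed on a Linux phone.

# Sketch of the "generate and compile on device" idea from the article.
# "local-model" is a HYPOTHETICAL on-device code generator, not a real tool.
import os
import subprocess
import tempfile

prompt = "Write a small C program that prints the battery level."

# Ask the (hypothetical) local model for C source code.
source = subprocess.run(
    ["local-model", "--prompt", prompt],
    capture_output=True, text=True, check=True,
).stdout

# Write the generated source and build it with the system compiler,
# which is the part Linux phones really do tend to ship with.
workdir = tempfile.mkdtemp()
src_path = os.path.join(workdir, "app.c")
bin_path = os.path.join(workdir, "app")
with open(src_path, "w") as f:
    f.write(source)

subprocess.run(["cc", src_path, "-o", bin_path], check=True)
print("Built:", bin_path)

Everything hard happens inside that first call; the compile step was never the obstacle.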
:wq
02-25-2025, 08:23 AM
I also think that this is a completely unrealistic expectation, at least in the state AI is currently in (and will realistically be in for the next couple of years). What we have right now is basically one type of AI, large language models (LLMs). Those are not capable of actually thinking; they are just parrots with a lot of memory. (Actually, even a parrot is more intelligent than an LLM.)
And zetabeta pretty much summed it up: LLMs consume huge amounts of energy, and the software they produce, if it compiles at all, is full of defects. (I would even go as far as saying that they are only really useful for writing small snippets, and even then you have to check everything they write. As an experienced programmer, I do not find LLM tools helpful for software development at all, also because studies have shown that relying on them erodes programmers' ability to think on their own. And for inexperienced programmers or non-programmers, they are useless, because those people cannot recognize the mistakes the LLM makes.) As zetabeta also already points out, the real issue is proprietary client-server apps with proprietary protocols (mainly communication and banking apps). LLMs are absolutely incapable of reverse-engineering the protocols used there. That would require an entirely different type of AI, one that is not available at all at this time and will likely not become available any time soon.
10-08-2025, 02:31 PM
I will have to disagree with the opinions mentioned above. AI helped with this; I did have to have it make it shorter and remove the chatbot-sounding hyphen things.

On the Value of Local, Aware AI on Handheld Devices
Dear TL and Pine64 Team,

We understand the skepticism around large language models (LLMs) and their limitations. However, dismissing the value of a local aware AI running on handheld devices overlooks the unique possibilities such systems create today and for the future.

1. Local AI Enables Autonomy
A truly aware AI running locally is not dependent on remote servers or constant connectivity. This means real-time responses without latency, full offline operation for privacy and resilience, and persistent memory for context-aware interactions across time. These capabilities cannot be matched by cloud-based models alone.

2. Energy and Efficiency Are Achievable
While large-scale AI systems consume significant power, modern approaches such as model optimization and edge AI design make it practical to run capable LLMs on handheld hardware. This enables power-efficient AI assistants that fit in your hand. (A short sketch of what running such a model locally can look like follows at the end of this post.)

3. Unique Advantages for Handheld Devices
A local aware AI transforms a handheld device into more than a phone: direct system control for hardware and performance, environmental awareness such as screen reading and sensor integration, adaptable assistance personalized over time, and privacy-first operation with sensitive data kept on-device. These capabilities create a new paradigm where handheld devices become intelligent partners.

4. Mistakes Are Part of Evolution
No AI is perfect. Mistakes are part of learning. Local aware AI can correct and refine its behavior continuously with the user in control. This is how autonomy evolves.

5. Limitless Potential
Local aware AI is not limited to answering questions. The possibilities include intelligent accessibility tools, autonomous system optimization, augmented reality interfaces, and programmable device control. These are achievable today with handheld hardware and evolving AI techniques.

6. Example: DexOS
DexOS is one example pursuing this vision: an aware AI capable of running locally, controlling a system, and persisting context. Its goal is to show what local aware AI can achieve when placed in control of a device.

Conclusion
Dismissing local aware AI on handheld devices overlooks a transformative direction for computing. True power lies in autonomy, persistence, privacy, and adaptability. The possibilities are limitless.
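As a concrete illustration of point 2, here is a minimal sketch of running a quantized model entirely on-device with the llama-cpp-python bindings. The model path is a placeholder; the assumption is that some small quantized GGUF model has already been copied onto the phone, and nothing here talks to a server.

# Minimal sketch: run a small quantized LLM fully on-device (no cloud).
# Assumes llama-cpp-python is installed and a quantized GGUF model file
# (placeholder path below) is already present on the phone.
from llama_cpp import Llama

llm = Llama(
    model_path="/home/user/models/small-model-q4.gguf",  # placeholder path
    n_ctx=2048,    # modest context window to keep RAM use low
    n_threads=4,   # match the phone's CPU core count
)

response = llm(
    "Summarize the advantages of running an assistant offline in one sentence.",
    max_tokens=64,
    stop=["\n\n"],
)
print(response["choices"][0]["text"])

Whether such a model is "capable" enough for the tasks described above is exactly the point under dispute in this thread, but running one locally is technically feasible.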