Claude Code is now available to $20 per month customers

I think I mentioned elsewhere that 2025/2026 is my year of being “all in” on AI options (within reason). And as part of this strategy I have $20-per-month subscriptions with Google, OpenAI and Anthropic, as I’ve been researching the strengths and weaknesses of each platform more deeply.

As most of you are probably already aware, Anthropic’s Claude is head and shoulders above its competitors when it comes to AI coding workflows. But until this last week, you needed, at minimum, the $100-per-month Max account to use Claude Code, Anthropic’s command-line option for writing code “agentically”, or else an API account, which can be the most expensive way of using Claude Code.
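
For anyone who hasn’t tried it yet, getting Claude Code running is a small job. A minimal sketch follows, assuming you already have Node.js installed and an eligible Anthropic subscription; the project path and the example prompt are purely placeholders:

```sh
# Install the Claude Code CLI globally via npm
npm install -g @anthropic-ai/claude-code

# Launch an interactive session from inside the project you want it to work on;
# on first run it prompts you to sign in with your Anthropic account
cd ~/Projects/my-project   # placeholder path
claude

# It can also be used non-interactively for one-off questions via print mode
claude -p "Summarise what the scripts in this repo do"
```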

I also have a Pro $20-per-month Cursor IDE account and a $100-per-annum GitHub Copilot account. For those who aren’t aware, GitHub Copilot works within VS Code, and Cursor is a custom-built version of VS Code designed specifically for AI coding workflows (OpenAI recently moved to acquire Cursor’s main competitor, Windsurf). Unfortunately, GitHub Copilot can’t be used within Cursor, but most (in my case, all) third-party VS Code extensions function fully within Cursor.

Using Cursor in tandem with Claude Code (within Cursor’s embedded Terminal) has been the most compelling AI-assisted coding experience I’ve used thus far. And much to my surprise it’s even been useful for working with ancient languages such as Common Lisp (at a guess because Lisp was designed back in the day specifically for building symbolic AI systems, so Claude has probably been trained on plenty of Lisp material).

The reason for posting this here is that, to my surprise, the Cursor/Claude Code combo is even useful for tackling typical Keyboard Maestro/Alfred/Raycast macro-creation tasks, the kind where a macro ends up relying on shell scripting in combination with AppleScript.
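
To give a flavour of the kind of glue I mean (not a DT- or KM-specific recipe, just an illustrative sketch), here’s the sort of shell-plus-AppleScript snippet that typically sits inside a Keyboard Maestro “Execute Shell Script” action or an Alfred/Raycast script step. The word-count action at the end is purely an example:

```sh
#!/bin/zsh
# Ask Finder (via AppleScript) for the first selected item, then hand the
# POSIX path to an ordinary shell tool - the usual AppleScript/shell hand-off
# that launcher and macro tools rely on.
selected=$(osascript -e 'tell application "Finder"
	set sel to selection
	if sel is {} then return ""
	return POSIX path of (item 1 of sel as alias)
end tell')

if [[ -n "$selected" ]]; then
	# Illustrative action: word-count the selected file
	wc -w "$selected"
fi
```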

Anthropic purpose-built Claude Code as a CLI tool because a CLI can be deployed with equal effectiveness across the widest range of user preferences, be that Vim, VS Code or one of JetBrains’ specialist IDEs.

With all this in mind, a lateral thought occurred to me with regard to DEVONthink. I don’t think it’s an unreasonable feature request for DT to provide a “Terminal” within DEVONthink, or perhaps a BBEdit-style set of CLI integration tools, that would let DEVONthink customers apply AI models directly within DT. The scripting hooks are already there for AS/JS use, so why not allow AI tools to access those hooks directly? It’s an intriguing thought, even if it’s not an immediate priority. :slight_smile:
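
To be fair, part of this is already possible from any terminal today, because DT’s AppleScript dictionary is reachable via osascript. A trivial sketch (the exact dictionary terms may vary by DT version, so treat it as an assumption rather than gospel):

```sh
# Drive DEVONthink from the command line through its existing AppleScript
# dictionary - here simply listing the open databases. An AI coding tool
# running in a terminal could generate and execute this sort of call.
osascript -e 'tell application id "DNtp" to get name of every database'
```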

I expect that both Google and OpenAI will follow Anthropic’s example and provide terminal access to their AI models for coding workflows. And if Anthropic finds that a healthy number of $20-per-month customers trade up to Max accounts, I’m sure Google and OpenAI will follow the same business model.

Further info on using the $20 Anthropic subscription with Claude Code is available here (the second link is a snappy playlist to get you up and running with Claude Code):

But for what purpose - what’s your actual use case?

1 Like

The proverbial “how long is a piece of string”. :slight_smile:

Most of the AI features in DT4 are chat-based and low-hanging fruit in terms of ChatGPT-style workflows. But in much the same way as one can use Keyboard Maestro to string together a combination of actions into a macro for use with DT (those macros might, in turn, lean on the bundled AppleScripts that can be installed with DT), it would be super useful to be able to create automation macros directly in DT with natural-language prompts, and for those macros to be optimised for multi-application workflows across macOS (and the wider Apple platform ecosystem).
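
As a very rough sketch of the idea as it can be hacked together today (the prompt, file name and workflow are purely illustrative, and this is me driving Claude Code from a terminal, not a proposed DT feature):

```sh
# Ask Claude Code (in non-interactive print mode) to draft a DEVONthink
# AppleScript from a plain-English description of the macro I want.
# In practice you'd tidy any surrounding prose out of the reply first.
claude -p "Write an AppleScript for DEVONthink that files every selected \
record tagged 'receipt' into a group called 'Receipts'" > file_receipts.applescript

# ...then read it before running it - never execute generated automation blind
open -e file_receipts.applescript
osascript file_receipts.applescript
```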

Jim, I know that you personally, and the wider DT team, are AS ninjas, but most of your customers don’t have the same skill set when it comes to macOS automation generally, or DT specifically.

I’ve also noticed a certain amount of scepticism from DT team members with regard to AI-assisted scripting/programming in general.

I would contend that AI is proving itself increasingly useful for the creation of bespoke automations at both the OS and individual-application level. I don’t often highlight Microsoft as exemplars in this area, but they’re doing great automation work, particularly with Copilot in Excel.

Yesterday’s WWDC keynote was littered with examples of Apple using its expanded “App Intents” framework to provide context-driven hooks across macOS and individual applications for smart, AI-assisted automations. To quote from the docs:

"The App Intents framework provides functionality to deeply integrate your app’s actions and content with system experiences across platforms, including Siri, Spotlight, widgets, controls and more. With Apple Intelligence and enhancements to App Intents, Siri will suggest your app’s actions to help people discover your app’s features and gains the ability to take actions in and across apps.

Even if you ignore the existence of Anthropic, Google and OpenAI, all major macOS developers need to be considering the App Intents framework, even if it’s simply to provide automation hooks for the new (Alfred/Raycast “sherlocked”) Spotlight, alongside Shortcuts, Apple Intelligence, etc.

I’m one of those who considers this year’s WWDC to be one of the most consequential in the last ten years or so. It was remarkable how little filler material there was in the keynote. For me, this was a return to Apple at their best. This was Apple saying: judge us on the sum of our parts.

I was surprised by how unapologetic Apple were regarding the overselling of Apple Intelligence last year, but it was clear that yesterday’s WWDC keynote was what they should have shown in 2024, had they been ready and not felt compelled by outside forces to show their AI hand too early. Either way, it was crystal clear that they’re fully aware of the roasting they’ve been getting since the rollout of Apple Intelligence (there was no sign of the age-old Apple “reality-distortion field”), and they’ve answered with strong words (in many ways, quietly spoken).

If you were looking for Apple’s next big thing, you’ll have been disappointed, but the observant will have noticed Apple’s biggest selling point for most of us loyalists: it’s all about the sum of the Apple ecosystem’s parts. And natural-language automation has been a key OS differentiator for Apple ever since AppleScript was introduced with System 7 in the early nineties. It’s a pity that it has taken computing thirty-odd years for natural-language automation to finally deliver on its initial goals, but I strongly believe Apple is better placed than Microsoft to deliver user-focused NLP automation workflows that work consistently across their multiple device platforms.

1 Like

Thanks for the clarification and an interesting perspective on the keynote (which I found underwhelming and wrongly focused on customers, not developers - but that’s Apple nowadays. I can’t imagine any developer thinking “Oh heck yes!!! Finally, a liquid glass interface for me to implement!” That’s client-facing advertising. It solves no problem but looks shiny for the customers.).

I can’t speak for development, but I don’t see App Intents as any great advancement except in the simplest use cases. There is still far more power in scripting than in being able to say, “Hey, Siri - open that Markdown file I made in DEVONthink yesterday.” And Shortcuts looks like it still falls way short of being a really useful automation tool. It even failed as an Automator replacement, let alone superseding AppleScript. Any tool claiming to be for automation but lacking a debugger is not an automation tool.

I am not running the beta yet, but I think Spotlight won’t replace Alfred, et al, for many people in the same way Shortcuts won’t/can’t replace AS/JXA/shell.

That being said, development has its own ideas and direction, so who knows what they’ll come up with.

PS: Don’t misunderstand my (our?) stance on automation as elitist or para-Luddite.

5 Likes

I concur. From what I read of the Intents documentation, I don’t see much progress for automation. Shortcuts is still a mess. Not because it uses a visual programming paradigm, but because it’s badly done. No debugger, not very stable.
And now they want developers to connect their apps to Shortcuts. Why? There’s AS, there’s JXA, both could be good for automation if they weren’t abandonware.
Instead we get Shortcuts, which is not good, and Intents, but no fixes for the technology that is already there.
And I’d bet that this stuff will be as dead in the water as Automator in three years’ time.
They are simply not very good at sticking with a product and continuously developing it.

3 Likes

I didn’t mention the presentation-layer design intent of “liquid glass”, but I do understand Apple’s design strategy - much as each mention of “liquid glass” grated.

To be frank though, your sneering attitude with reference to the purpose of design speaks volumes. When working with iOS and iPadOS, interaction design is at least 60% of the engineering effort, and iOS/iPadOS users make up a far larger customer segment than macOS users. The UI mechanism of a glass layer that expands and then contracts to let the app content occupy the majority of the available presentation real estate is a worthy goal, and one that was well delivered. And by the way, developers get the core of the move to the new design strategy for free: recompile your existing app with Xcode 26 and the core of the new design framework is in place. The developer will of course need to customise things to get the most from the new design treatment.

The design changes to iPadOS are monumental, but of course these are the changes iPad power users have been screaming out for since the very first iPad Pro, and have doubled down on since the M-series-powered iPad Pros.

I am not running the beta yet, but I think Spotlight won’t replace Alfred, et al, for many people in the same way Shortcuts won’t/can’t replace AS/JXA/shell.

I’ve been a Mega Supporter of Alfred since its earliest days, and I’m also a Pro + AI customer of Raycast. I don’t see them as competitors, as each has its own strengths. And Raycast is my preferred way of running AI automations on DEVONthink content without the need for an API key. Regarding the changes to Spotlight itself in macOS, it goes without saying that Alfred and Raycast are far more powerful, but Apple providing a similar feature set natively can only help macOS further its reputation as a power-user-friendly OS. However, whatever macOS’s reputation among power users, the reality is that in the bigger scheme of things power users make up a slender piece of the overall user-community pie, so Apple putting Spotlight front and centre once again can only be a good thing for all. If mainstream macOS customers get on board with the power of launcher-based automations, that will encourage Apple to provide deeper hooks for third-party developers to leverage.

But hell, I remember when Spotlight was first added to OS X 10.4 (Tiger) in 2005. One of my work colleagues proudly scoffed at my use of Quicksilver (which was all the rage 20 years back), believing Spotlight would retire launchers like Quicksilver to the wastebasket. How wrong he was!

Don’t misunderstand my (our?) stance on automation as elitist or para-Luddite.

I didn’t suggest for a second that you or anyone else at DEVONtechnologies are elitists, although I do remember your recent berating of a DT user over their inability to debug via the built-in Script Editor when they currently use Script Debugger (and were worried that, with Script Debugger being EOL’d, this would affect their ability to write and debug AppleScript). I didn’t think that particular customer support query was dealt with in a sympathetic manner.

To sign off, I’m a creative strategist by trade, with a specialism in interaction/experience design. I’ve been doing this since the mid-nineties, and I have never encountered a situation where design wasn’t considered an integral engineering task, with programmers and visual designers working together in tandem from project inception through to delivery. As to yesterday’s keynote being a sales pitch to end users, that’s always been the nature of the keynote, but what follows is the in-depth developer content. If you go to the Apple Developer YouTube channel, there are already 120 videos providing developer-specific content relating to the new keynote announcements.

2 Likes

Could you clarify how this works?

It is only fair to point out that a design change takes up much more room at a developers’ conference than developer issues do. There’s nothing “sneering” about that – developers are dealing with years-old bugs, e.g. in PDFKit, and Apple seems to focus on the visual appearance.
Again: at a developers’ conference. Developers can do nothing with or about the design; it’s simply thrown at them as it is at everybody else.

3 Likes

UX designer here. Love Apple. Liquid Glass is quite underwhelming. A bit shallow compared to what Ive and Altman are said to be cooking up. Shortcuts? Oh, please. Apple badly needs to make some REAL user experiences AND, why not say it, COMPUTING better. Eye candy doesn’t cut it anymore. It is quite frightening, actually. And developers, programmers, etc., were always my best friends. Since VisiCalc and Excel, and HyperCard. Much better to give customers what they didn’t even know they needed than what they want right now, especially in these influencer days.

4 Likes

I remember much the same tirades against shallow design decisions when iOS 7 was first introduced. And then, within two years, it wasn’t just the rest of the Apple ecosystem that was moving away from skeuomorphism; both the mobile and desktop versions of competing OSes were taking their lead from Apple.

Did Apple get things right on the first pass? Most certainly not, but the design strategy was correct, and iterative feedback cycles made good on specific bad choices.

As a Vision Pro user, I can see where the glass metaphor originated. But I’m glad that for once Apple has thought about new interaction design choices across the whole ecosystem - watchOS, iOS, iPadOS, macOS and visionOS. We’ve yet to see if it’s the right decision, but the developer betas are showing good promise.

BTW, it’s worth stating that at no point have I defended the glass presentation layer in isolation. I’ve always spoken with regard to the overarching interaction design strategy. And part of that strategy is the integration of machine-learning operations throughout the user experience, which means that AI access points sit at the point of need, rather than behind a singular, all-powerful chatbot application layer.

The elephant in the room is Siri, but having watched the post-State of the Union interview on Tom’s Hardware’s YouTube channel (Apple shamelessly didn’t do this with John Gruber this year, as he posted a critical piece in March about Siri/Apple Intelligence), it appears the v2 re-architecting of Siri won’t appear until some point in 2026. Seeing as the computing industry as a whole is betting big on voice-controlled, vision-based computing being a significant part of the mobile computing mix, I can’t see Apple making this public until they’re certain the launch will be embarrassment-free. The risk here is that the major LLMs may have replaced Siri for the majority of Apple’s mobile customers if they delay too long. The Perplexity app already integrates reasonably well with Apple’s key productivity apps on iOS/iPadOS. Apple holds the trump card in the sense that Siri will be able to do far more than a third-party LLM, as it will have deeper hooks, plus it will function privately on-device and/or on the private cloud where Apple’s foundation LLM is hosted.

When using Raycast AI with a third-party application, you simply pre-select the text or documents that your LLM needs as input tokens, and you then prompt the LLM against that pre-selection. It’s worth mentioning that I index content into DT rather than importing it, but the workflow should work equally well with imported docs/content.

Any of the key AI workflows the DT4 docs mention - e.g. summarisation, tag suggestion, semantic concept linking - will work equally well with Raycast AI. And much like Perplexity/Kagi, Raycast provides all the main LLM models, and you can use those models for multimodal purposes, not just text. It’s slightly less convenient than an API-key solution in the case of, say, tag suggestions, but overall it’s very flexible.

The main limitation with Raycast Pro AI is that you get finite request limits. However, these limits are very generous. Raycast Pro provides 50 requests per minute, up to a maximum of 300 requests per hour, for the fast, lower-cost Gemini, Claude and OpenAI models. The Advanced AI models are only available with the Advanced AI account, and these provide 75 requests per 3 hours, up to a maximum of 150 requests per 24 hours. But you don’t require the Advanced AI models for typical DT4 workflows.

A complete breakdown of all available models is available here.

And the full Raycast AI documentation is available here.

1 Like

I’m entitled (since Apple IIe) to tirades against Apple :rofl:
Consistency throughout the whole ecosystem? Yes and no. iOS apps on a desktop? That means surrendering to the former. Maybe that’s what most consumers want… But if that is the case, a much better paradigm would emerge if Ive + Altman pull that rabbit out of their hats - which is not at all a given. AI access at the point of need? Well, it goes without saying that a single chatbot app layer is prehistoric (but a necessary part of getting to where we are) - but AI access at the point of need is also a rudimentary step towards the goal of actually having a fully “intelligent” - and ideally invisible - device with decent natural-speech recognition.
What is slightly irksome is Apple always selling stuff as game-changing and then letting all of us down - from HyperCard to automation. So Liquid Glass (and I know you’re not defending it in isolation, sure) annoys me, and the promise to leverage Shortcuts is not to be seriously believed.
Apple has been late to the game, and always too little. Computing could be way better than this. When I say computing, I mean hypertext, Engelbart, all the way to what Ive has been advocating.

2 Likes

It sounds like we both come from the same place in terms of the better pillars of what Apple has brought to the world of computing. With Bill Atkinson’s death this week, HyperCard has very much been front of mind for me. So much so that I’ve been re-reading Folklore.org: Joining Apple Computer from beginning to end.

We do :slight_smile:

(and possibly even from the same places he he)

Yep, sad to hear about Bill Atkinson’s passing. I also loved his photography and have his “Within the Stone” book.

HyperCard was my nirvana, and after it was decommissioned, I took up SuperCard, etc.

Myst.

1 Like

I forgot to mention, I’m with you 100% on this. The slow iOS-ification of macOS grates on me more than any other incremental strategic pivot by Apple over the last ten years or so. Luckily, very little of my core macOS headspace involves native Apple apps directly, even if I rely on deeper Apple APIs.

thanks for this post.

quick clarification question: Cursor already seems to use models like Claude. what’s the advantage of using Cursor ($20) + paying additionally for Claude Code (another $20)?

thanks!

Very well-written comment, thank you for taking the time to put this together :+1:t2:

1 Like

Claude Code is a specific CLI tool set for interacting with Claude AI models for programming purposes. This is different from using Cursor Pro to access Claude AI models for programming. They each provide different workflows towards the same end goal. I know it sounds confusing, but the video below is a good explainer of the benefits of using both in tandem. However, if budget is a consideration and you’re only interested in using Claude AI models for programming, the CLI tool Claude Code would be my primary pick, as long as you’re happy working in a terminal interface, be that the default Terminal that ships with macOS or any of the third-party replacements.

If, however, you want the freedom to use multiple AI models (e.g. those from Google and OpenAI as well as those from Anthropic) for programming tasks, Cursor will be the better option.

Here’s that short video explaining the benefits of using Claude Code and Cursor in tandem.

1 Like

thanks a lot for this clarification, that helps!

The Raycast developers have responded to the changes to Spotlight in macOS 26, and I’m happy to say they echoed my initial thoughts about what it means for the launcher application category.

TL;DW - the Raycast team believe they could/should grow their user base significantly, as Apple is now educating mainstream users about the possibilities of launcher tools.

I’ll happily use Spotlight alongside Alfred and Raycast, as each shines in different areas.