Latest Posts

Why These Contract Clauses Are Scary

Header image: the word Contract etched in concrete, with a crack between "Con" and "Tract" dividing the word (Credit: Lane V. Erickson / Shutterstock.com)

The clauses I'm going to discuss in this post come from the contract of Shadow Light Press, a publisher of fantasy and science fiction with deep ties to the LitRPG community. The contract stirred controversy this week when it was publicly posted on Reddit--largely from people aghast at its author-unfriendliness, but also because of alleged issues around the publisher's relationship with a major progression fantasy and LitRPG Discord server.

I'm not going to go into those issues here (you can find out more, if you like, from the comments on the Reddit post). My focus will be on the contract itself (which I can confirm is authentic: I've seen several other copies). My intent--as always with my posts about publishing contracts--isn't just to call out a publisher for problematic language, but to explain why it's problematic, with the goal of empowering my readers to better evaluate the contracts they may be offered. New writers, who may be less knowledgeable or dazzled by the prospect of publication, are especially vulnerable to predatory contracts like this one.

Oh, and I'm not a lawyer. So what follows isn't legal commentary or advice.

Kindle’s New Gen AI-Powered “Ask This Book” Feature Raises Rights Concerns

Header image: a Kindle device with the screen showing the Kindle logo, against a blurry background of book covers (Credit: Matthew Nichols1 / Shutterstock.com)

In a recent press release, Amazon noted that some new features were coming to Kindle.

We’re adding new AI-powered reading features that preserve the magic of reading on Kindle. Story So Far lets you catch up on the book you’re reading—but only up to where you’ve read without any spoilers. For our endlessly curious readers, Ask this Book will let you highlight any passage of text while reading a book and get spoiler-free answers to questions about things like a character’s motive or the significance of a scene.

The lead article in today's Publishers Lunch (PL) is all about Ask This Book, which went live in the Kindle iOS app earlier this week (it'll be rolled out on all devices and Android OS in 2026). Amazon's breezy announcement of the feature's debut describes it as "your expert reading assistant, instantly answering questions about plot details, character relationships, and thematic elements without disrupting your reading flow." You can highlight a phrase or sentence, type a question into a search box, and AI will generate an answer "right on the page." There's a little video to demonstrate the process.

Royalties in Arrears: Mango Publishing / Blushing Books / Bottlecap Press

Header image: magnifying glass showing caution sign (red triangle/red exclamation point) surrounded by red exclamation points on bright yellow background (Credit: Ohayo style / Shutterstock.com)

Publishers do a lot of bad things (as the archives of this blog attest), but among the most infuriating--and, often, the hardest to remedy--is the failure to pay authors the money they are due. Non-payment of royalties and/or failure to provide sales reporting are among the most common publisher complaints Writer Beware receives.

Below, you'll find a collection of recent offenders.

In January 2025, Publishers Lunch reported on layoffs at Florida-based Mango Publishing, along with the departure of publisher Brenda Knight to form her own company.

If a Famous Author Calls, Hang Up: Anatomy of an Impersonation Scam

Header image: a business-suited wolf, hiding behind a smiling mask, facing an unsuspecting woman who is about to be scammed (Credit: dariodraws / Shutterstock.com)

You open your email program one morning. The usual work stuff. Some spam (annoying that it got past your filters!). A couple of newsletters (maybe later). You sip your coffee, scroll down.

Wait. What's this? An email from...Suzanne Collins? The Suzanne Collins?

This can't be real, you think. Why would Suzanne Collins be contacting you out of the blue? And why is she introducing herself as if she were an unknown writer querying for her unpublished manuscript?

Predatory Opt-Outs: The Speculators Come for the Anthropic Copyright Settlement

Header image: an iPhone screen with the Anthropic logo, against a multi-colored background of $100 bills (Credit: Ascannio / Shutterstock.com)

The enormous, $1.5 billion Anthropic copyright class action settlement is reputedly the biggest copyright infringement recovery in history. With such a high-profile case, it's inevitable that eligible authors aren't the only ones looking to benefit.

Yesterday, the Publishers Lunch newsletter published a story about an Arizona law firm called ClaimsHero that has mounted a push, complete with social media ads, to encourage authors to opt out of the Anthropic settlement. Why? Presumably, because authors who opt out preserve their right to sue Anthropic, and ClaimsHero wants to identify clients for a possible class action lawsuit of its own that could enable it to reap a big payout on contingency.

ClaimsHero, which appears to be the kind of law firm that advertises on billboards along the highway, has created a webpage for this effort that frames opting out in terms of money (of course). If your work is included in the settlement, why settle for a measly $3,000 when you could receive up to $150,000, the maximum amount of statutory damages available for willful copyright infringement? (Emphasis added):

The Anthropic Class Action Settlement: What You Need to Know Right Now

Header image: multicolored, backlit letters AI repeating diagonally on a yellow background (Credit: Steve Johnson / Unsplash.com)

One of the most urgent issues confronting writers and other creators right now is the use of copyrighted material for generative AI training.

The large language models that power chatbots like OpenAI’s ChatGPT and Anthropic’s Claude require “training” via the ingestion of vast amounts of text, images, and other materials scraped from the internet or incorporated into databases created by AI companies.

Much of this material is protected by copyright. For the most part, AI companies have not sought permission from rights holders to exploit their work in this way (nor did creators begin to discover the extent of the companies' use of their content until a few years ago). The companies claim permission isn't needed because AI training falls under the definition of fair use—a limited and transformative use of the material that does not require the copyright owner's agreement (or remuneration).