How to Use Claude Code for Dissertation Writing

Every grad student hits the same wall. You've got a claim, you know the literature supports it somewhere, and now you need to find the specific, recent, peer-reviewed paper that says so. Multiply that by every sentence in your literature review. It's not intellectually hard. It's just slow, and the slowness compounds.

Claude Code with the Scite MCP changes the math on this pretty significantly. You get an AI assistant in your terminal that can actually search scientific literature, not the open web, and tell you not just what's been published but how it's been cited since. Here's the setup and a few workflows that are worth knowing about.

Quick background on what's actually happening here

Claude Code is Anthropic's terminal-based AI tool. Scite is built on a citation graph of over 1.6 billion Smart Citations, which track whether a paper was supported, contrasted, or just mentioned by the work citing it. MCP (Model Context Protocol) is the connector between them. Once you hook Scite into Claude Code, you can ask research questions in plain language and get answers grounded in indexed, peer-reviewed literature, not whatever Claude's training data happens to contain.

The distinction matters. Without the MCP connection, Claude is guessing from memory. With it, Claude is searching Scite's full-text index and citation data in real time.

Setting it up

If you don't have Claude Code yet, install it first (Anthropic's documentation covers installation for each platform). You'll need a Pro, Max, or Team plan. Once it's running, one command in your terminal (not inside Claude Code):

claude mcp add --transport http scite https://api.scite.ai/mcp --scope user

The --scope user flag means Scite stays available across all your projects. You'll authenticate with your Scite account when prompted. To check it's working, run /mcp inside Claude Code and look for Scite in the list.
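If you'd rather verify from the shell than inside a session, Claude Code's CLI can also list configured servers. A quick sketch; the exact output format may vary by version:

```shell
# List all MCP servers registered for your user scope,
# along with their connection status.
claude mcp list

# Show the stored configuration for the Scite server specifically.
claude mcp get scite
```

If Scite doesn't appear, re-run the add command and check that the URL was typed exactly as shown.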

You need a Scite subscription for MCP access. There's a 7-day free trial if you want to test the workflow before committing.

Finding sources that actually match your claims

This is the pain point we hear about most. One grad student put it simply: "I need peer-reviewed, in-text citations that are not older than 2022 to support every statement in my chapter." That's a grind if you're doing it manually across databases. In Claude Code, it's a prompt:

"Find peer-reviewed studies published after 2022 on the relationship between teacher burnout and student achievement."

What makes this different from a regular search is that Scite indexes full text, not just titles and abstracts. You'll surface papers that keyword searches miss. And because the results come with citation context, you can see immediately whether a study has been supported or challenged by later work, before you even open the PDF.

Checking your references against the citation graph

Drafts accumulate citation errors. Wrong year, wrong first author, or worse, a paper that doesn't actually say what you think it says. Committees catch this stuff, and it's tedious to verify by hand.

Paste a section into Claude Code and try:

"Check the references in this section. Verify that each cited paper exists, the metadata is correct, and the paper supports the claim I'm making."

Claude will cross-reference your citations against Scite's data and flag mismatches. It's not infallible. You should still spot-check. But it catches the kind of errors that creep in over months of drafting and revising.
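If you'd rather run this check as a batch step instead of pasting text interactively, Claude Code's non-interactive print mode (-p) accepts input on stdin. A sketch, assuming a draft saved as chapter.md (the filename and output file are illustrative):

```shell
# Pipe a draft section into Claude Code in non-interactive mode
# and save the resulting reference report. "chapter.md" and
# "reference-report.txt" are hypothetical names for this example.
cat chapter.md | claude -p "Check the references in this section. \
Verify that each cited paper exists, the metadata is correct, \
and the paper supports the claim being made." > reference-report.txt
```

This fits naturally at the end of a drafting session: run it once per chapter, then spot-check whatever it flags.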

Seeing how your key sources have held up

A finding from 2019 that's been contradicted three times since isn't a great foundation for your argument. This is the kind of thing that's genuinely hard to check manually and genuinely easy with Smart Citations:

"For the key studies cited in this section, show me whether they've been supported or contrasted by subsequent research."

This is especially useful in the lit review, where your committee wants to see that you understand the state of the evidence, not just that you can list papers.

Building out a bibliography from scratch

Starting a new chapter and need to map the landscape:

"Find the most-cited studies on [your topic] from the last five years. For each, give me the citation, a one-sentence summary, and whether it's been supported or contrasted."

Not a replacement for deep reading. But it gives you a structured starting point in minutes instead of days.

What this won't do

It won't write your dissertation. It won't replace reading the papers. And it's working with Scite's index, which is vast but not exhaustive. If a paper isn't in the citation graph, Claude won't find it through this connection. You'll still want to cross-reference with your discipline's primary databases.

Also worth noting: Claude Code runs on tokens, and heavy research sessions add up. It's a meaningful productivity tool, not an unlimited one.

If the terminal isn't your thing

Not everyone wants to work this way, and that's fine. Scite Assistant and Scite Search offer similar capabilities in the browser. You can ask natural-language questions, get citation-backed answers, and explore how any paper has been cited across the literature. The Scite MCP also works with Claude.ai, ChatGPT, and other MCP-compatible tools. Setup instructions for each are at scite.ai/mcp.

Get started

Connect Scite to Claude Code and see what it does for your workflow. Setup takes a minute: scite.ai/mcp