Academic Deep Research

Transparent, rigorous research with full methodology, not a black-box API wrapper.

Academic Deep Research is a community skill for rigorous research with transparent methodology, covering exhaustive investigation through mandated two-cycle research processes, source verification, comprehensive literature review, methodology documentation, and systematic evidence synthesis for academic-quality analysis.

What Is This?

Overview

Academic Deep Research provides structured, transparent research processes that prioritize rigor over speed. It covers exhaustive investigation through mandated two-cycle research, which requires multiple search and synthesis passes; source verification, which validates the credibility and relevance of all cited materials; comprehensive literature review, which identifies key papers, authors, and theoretical frameworks; methodology documentation, which records the complete search strategy and analysis approach; and systematic evidence synthesis, which combines findings from multiple sources into coherent conclusions. The skill helps researchers conduct thorough investigations with full accountability and reproducibility, making it particularly valuable for work that will be peer-reviewed or used to inform high-stakes decisions.
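
The two-cycle process described above can be sketched as a simple loop: a broad first pass, gap identification, and a refined second pass. Everything here is an illustrative assumption; `search`, `synthesize`, and `identify_gaps` are hypothetical stand-ins for whatever search and analysis tooling is available, not part of any real API.

```python
# Hypothetical sketch of the mandated two-cycle research process.
# The search/synthesize/identify_gaps callables are illustrative
# stand-ins, not a real API.

def run_two_cycle_research(topic, search, synthesize, identify_gaps):
    """Run two search-and-synthesis passes, refining queries from gaps."""
    # Cycle 1: broad queries derived from the topic itself.
    queries = [topic]
    sources = search(queries)
    findings = synthesize(sources)
    gaps = identify_gaps(findings)

    # Cycle 2: refined queries targeting the gaps found in cycle 1.
    refined = [f"{topic} {gap}" for gap in gaps]
    sources = sources + search(refined)
    findings = synthesize(sources)

    return {
        "queries": queries + refined,  # full search log for reproducibility
        "sources": sources,
        "findings": findings,
        "gaps_from_cycle_1": gaps,
    }
```

Keeping the full query log in the return value is what makes the process reproducible: another investigator can rerun the same queries against the same databases.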

Who Should Use This

This skill serves academic researchers needing comprehensive literature reviews, policy analysts requiring evidence-based research with documented methodology, graduate students preparing dissertation proposals, and research teams prioritizing depth and rigor over quick answers.

Why Use It?

Problems It Solves

Black-box research APIs provide conclusions without revealing search strategies or source evaluation criteria. Single-pass searches miss important papers and perspectives that require iterative query refinement. Rapid research sacrifices thoroughness and may overlook contradictory evidence or alternative interpretations. Research without documented methodology cannot be reproduced or verified by other investigators, undermining the credibility of findings and limiting their usefulness to the broader research community.

Core Highlights

Two-cycle research engine requires multiple search and synthesis passes for completeness. Source validator checks credibility, relevance, and citation quality of all materials. Literature mapper identifies key papers, authors, and theoretical frameworks systematically. Methodology tracker documents search strategies and analysis approaches for full transparency.

How to Use It?

Basic Usage

research_task:
  topic: "Climate change impact on agriculture"
  depth: deep
  cycles: 2
  requirements:
    - comprehensive literature review
    - source verification
    - methodology documentation
    - evidence synthesis
  output_format: structured_report
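
A configuration like the one above can be checked before a run starts. The sketch below validates a task represented as a plain dictionary; the field names mirror the example, but the validation rules (and the idea of a validator at all) are assumptions for illustration.

```python
# Minimal sketch of validating a research_task configuration.
# Field names follow the example above; the rules are illustrative.

REQUIRED_FIELDS = {"topic", "depth", "cycles", "requirements", "output_format"}

def validate_task(task):
    """Raise ValueError if the task dict is missing fields or under-specified."""
    missing = REQUIRED_FIELDS - task.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    if task["cycles"] < 2:
        # The skill mandates at least two search-and-synthesis passes.
        raise ValueError("at least two research cycles are mandated")
    return True
```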

Real-World Examples

cycle_1:
  search_queries:
    - "climate change agricultural productivity"
    - "crop yield temperature effects"
    - "extreme weather farming impact"
  sources_found: 45
  key_themes:
    - yield reduction in major crops
    - adaptation strategies
    - regional variation in impacts
  gaps_identified:
    - limited data on developing nations
    - need for economic impact analysis

cycle_2:
  refined_queries:
    - "agricultural adaptation developing countries climate"
    - "economic costs climate agricultural losses"
  additional_sources: 28
  synthesis:
    - integrated findings across regions
    - identified contradictory evidence
    - documented methodology limitations

report:
  methodology:
    - search databases used
    - inclusion and exclusion criteria
    - quality assessment framework
  findings:
    - comprehensive literature synthesis
    - evidence strength ratings
    - conflicting interpretations noted
  limitations:
    - publication bias considerations
    - gaps in current research
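
Assembling the final report from per-cycle logs can be sketched as below. The field names follow the report structure shown above; the helper function and the shape of the cycle logs are hypothetical illustrations, not a real API.

```python
# Hypothetical sketch of building the report structure above from
# per-cycle logs. Field names mirror the example report.

def build_report(cycles, databases, criteria):
    """Combine per-cycle logs into a methodology/findings/limitations report."""
    return {
        "methodology": {
            "databases": databases,            # search databases used
            "criteria": criteria,              # inclusion/exclusion criteria
            "queries": [q for c in cycles for q in c["queries"]],
        },
        "findings": [f for c in cycles for f in c["findings"]],
        # Unresolved gaps become documented limitations.
        "limitations": [g for c in cycles for g in c.get("gaps", [])],
    }
```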

Advanced Tips

Document search queries and databases used in each cycle to enable full reproducibility of the research process. Use the second cycle to specifically address gaps and contradictions identified in the first pass, including searching for dissenting studies that challenge dominant findings. Rate evidence strength explicitly to distinguish between well-supported findings and preliminary or contested conclusions, using established frameworks such as GRADE where appropriate.
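
Explicit evidence-strength ratings can be as simple as a coarse labeling rule. The sketch below is loosely inspired by GRADE-style certainty levels, but the labels and thresholds are illustrative assumptions, not an implementation of the actual GRADE framework.

```python
# Hypothetical evidence-strength rating, loosely inspired by GRADE-style
# levels. The labels and thresholds are illustrative assumptions only.

def rate_evidence(n_supporting, n_contradicting, has_systematic_review):
    """Return a coarse strength label for a finding."""
    if n_contradicting > n_supporting:
        return "contested"      # dissenting studies outweigh support
    if has_systematic_review and n_supporting >= 3:
        return "strong"         # converging evidence plus a systematic review
    if n_supporting >= 2:
        return "moderate"
    return "preliminary"        # single-study or thin support
```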

When to Use It?

Use Cases

Conduct a comprehensive literature review for a dissertation proposal with full methodology documentation. Prepare evidence-based policy briefs that require transparent source evaluation and systematic synthesis. Investigate complex research questions where depth, thoroughness, and reproducibility matter more than speed. This approach is also well-suited to systematic reviews in clinical, social science, or environmental research contexts.

Related Topics

Academic research, literature review, systematic review methodology, evidence synthesis, research transparency, and reproducible research.

Important Notes

Requirements

Clear research question or topic with defined scope for focused investigation. Access to academic databases and search tools for comprehensive literature coverage. Time commitment for multiple research cycles and thorough synthesis of findings.

Usage Recommendations

Do: document every search query, database, and selection criterion for full transparency. Use the second research cycle to address gaps and verify contradictory evidence from the first pass. Rate the strength of evidence explicitly to distinguish well-supported claims from preliminary findings.

Don't: skip the second cycle even when initial results seem comprehensive, since iterative refinement uncovers important sources. Don't accept sources at face value without carefully evaluating their credibility, methodology, and potential biases. Don't rush to conclusions before synthesizing evidence systematically and considering alternative interpretations.

Limitations

Two-cycle research requires significantly more time than single-pass approaches and may not suit urgent deadlines. Access to paywalled academic journals and databases may limit the comprehensiveness of literature coverage. Human judgment in source evaluation and synthesis introduces potential bias despite systematic methodology.