Refinement Guide — From Idea to Testable Question
Turning an interesting idea into a concrete, feasible project is often the hardest step.
This guide helps you refine your concept into a well-structured proposal that can be approved and completed within one semester.
This guide is meant to help you think clearly, not to restrict you. Use it as a framework for reflection — a way to check that your idea has a clear question, feasible plan, and measurable outcome. As your understanding deepens, your question or methods may evolve — that's normal and encouraged. Refinement is part of the research process, not a failure of planning.
Step 1 — Define the Core Question
A good CACoM project starts with a focused, testable research question.
Avoid broad or vague goals like “analyze clinical data” or “build an ML model.”
Even more importantly, avoid non-ideas — projects that miss the purpose of this course entirely.
Instead, aim for specific, measurable objectives.
If you're reading this and thinking,
“Okay, but how do I actually refine my idea? I know nothing about this topic…” —
that's perfectly fine. Recognizing that gap is your first real step toward doing research.
CACoM is a rare opportunity.
You have access to an unusually open combination of data, hardware, collaborators, and clinicians — resources that most university courses would never share so freely.
We are generous with this access because we value initiative and curiosity.
But that generosity flows only to those who take a half-step toward it. If you feel lost, start reading — papers, textbooks, online sources. Ask questions. Talk to instructors, collaborators, and guest speakers. Use class time to probe and clarify. Refinement is an active process, not something that happens automatically.
And a final warning:
If you plan to “surf along” until the last week and then cram in a quick project — this course is not for you.
Success in CACoM comes from early, consistent engagement and genuine curiosity.
Examples: from vague to concrete (and what not to do)
| Type | Example | Why |
|---|---|---|
| 1. Vague idea | “Predict complications in pregnancy.” | Too broad — which complications, which data, and what prediction horizon? |
| 1. Refined question | “Can features extracted from CTG recordings predict fetal distress, as measured by arterial pH, within the last hour before delivery?” | Specific, measurable, clinically relevant, and computationally feasible. |
| 2. Vague idea | “Use IMU data for gait analysis.” | Needs clear target and metric. |
| 2. Refined question | “How accurately can a ZUPT-based algorithm estimate step length in treadmill vs. overground walking?” | Defines method, evaluation, and context. |
| 3. Vague idea | “Do NLP on clinical notes.” | Unclear goal, no metric. |
| 3. Refined question | “What is the accuracy of a fine-tuned transformer model for classifying discharge summaries by diagnosis category?” | Defines dataset, model, and measurable outcome. |
| 4. Vague idea | “Build a heart rate simulator.” | Broad and hardware-centric — unclear what the simulator is for or how it will be evaluated. |
| 4. Refined question | “Can a mechanical fetal heart rate simulator driven by real CTG recordings reproduce realistic Doppler signals that could be used to test fetal monitoring algorithms?” | Well-defined engineering experiment: connects hardware design with a measurable research objective. |
| 5. Non-idea | “Explore the use of ChatGPT in the context of cachexia in cancer patients.” | Verifying or benchmarking LLM outputs for clinical tasks is, at present, nearly impossible and outside CACoM's realistic scope. |
| 6. Non-idea | “Develop a lightweight cognitive-assessment game using Unreal Engine 5.” | Building a tool or GUI is not a research question. CACoM is about using tools to answer a clinical or computational question — not about making the tools themselves. |
| 7. Non-idea | “Create a website/app to display hospital statistics.” | Pure software engineering; no hypothesis or analysis. |
| 8. Non-idea | “Use AI to predict anything from any dataset we can find online.” | Unfocused, unverifiable, and ethically questionable without defined data and metrics. |
A strong project question is specific, answerable, and relevant, connects computation to clinical insight, and produces measurable outcomes that can be evaluated within one semester.
CACoM is not a software-engineering, video-making, or app-development course. It is about understanding and answering clinically meaningful questions using computational and statistical methods.
You are free to use any tool or technology you please — ChatGPT, Claude, MATLAB, Python, Julia, R, TensorFlow, Unity, BLE, IoT, LLMs, GUIs, DaVinci Resolve, <Insert Fancy Toolname Here> — but only as a means to an end. If your proposal centers on building the tool instead of using it to gain insight into a medical or biological question, you are off-topic and will fail this course.
The course rewards depth of understanding, scientific reasoning, and critical analysis — not the choice of framework, interface, or architecture.
Step 2 — Clarify Your Inputs and Outputs
Whether your project analyzes data, builds a model, or constructs an experimental setup, you must clearly specify what goes in, what comes out, and how success will be judged.
| Element | Ask yourself |
|---|---|
| Data or system | What dataset, signal source, or experimental setup will you use? If you plan to collect your own data (e.g., with an IMU, stethoscope, or simulator), describe how and under what conditions it will be gathered, and ensure ethical and practical feasibility. |
| Inputs / features | What will you measure, extract, or compute? For an experiment: what physical parameters will you record or control? |
| Outputs / targets | What will you predict, estimate, compare, or reproduce? For an engineering project: what measurable signal or behavior do you expect to obtain? |
| Success metric | How will you decide if your approach or prototype worked? Define quantitative criteria — e.g. algorithm accuracy, signal fidelity, reproducibility, or agreement with a reference. |
If your data are not yet available, define synthetic, simulated, or publicly available substitutes for early prototyping and validation.
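As an illustration of such a substitute (all signal parameters here are made up for the sketch, not taken from any real dataset), a few lines of Python can stand in for pending data — for example, a noisy synthetic fetal-heart-rate-like trace to prototype a processing pipeline against:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Synthetic stand-in for a fetal heart rate trace: baseline around 140 bpm
# with slow variability plus measurement noise, sampled at 4 Hz for 10 minutes.
fs = 4                                   # sampling frequency in Hz (assumed)
t = np.arange(0, 600, 1 / fs)            # 10 minutes of time stamps
baseline = 140 + 5 * np.sin(2 * np.pi * t / 120)  # slow oscillation (period 2 min)
noise = rng.normal(0, 2, size=t.shape)   # Gaussian measurement noise
fhr = baseline + noise                   # the placeholder signal

print(fhr.shape)                         # one sample per time stamp
```

Swapping this placeholder for the real recordings later should require no change to the downstream analysis code — that is the point of defining the substitute early.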
Projects involving engineering or experimental work must include a computational or analytical evaluation — for example, verifying that a device produces realistic signals, or that a simulator can be used to test algorithms. Building hardware or software alone is not enough; the outcome must still answer a scientific or clinical question.
Step 3 — Frame the Clinical or Scientific Relevance
Every CACoM project should connect its technical core to a meaningful biomedical or clinical context.
Explain clearly:
- Why this problem matters for medicine, biology, or healthcare.
- How computational or experimental methods can add new insight, enable validation, or improve understanding.
- What a positive or negative result would imply — for instance, better signal reliability, improved diagnostic interpretation, or a validated dataset for future research.
- Who might ultimately benefit (clinicians, patients, data scientists, researchers).
Even if your work is primarily methodological or engineering-focused, the motivation and interpretation must remain clinical or scientific, not technological. Tools and devices are valuable only insofar as they help answer a clearly defined question.
Step 4 — Choose and Justify Your Method
Describe your planned computational, statistical, experimental, or analytical approach — not in exhaustive detail, but enough to demonstrate that your project is feasible and will yield interpretable results.
Possible types of approaches include:
- Computational analysis — ML, signal processing, simulation, or modeling
- Engineering or experimental — sensor design, data acquisition systems, simulators, or controlled experiments
- Systematic reviews or meta-analyses — structured evidence synthesis following transparent inclusion/exclusion criteria
- Survey-based or clinical studies — quantitative or qualitative analyses of responses, performed in collaboration with clinical partners
You don't need to invent new algorithms or instruments. Reproducing, benchmarking, or combining existing methods is fully acceptable if your question is well-defined and your analysis is rigorous. If you are collecting data (e.g., through surveys), focus on how your instrument measures what you think it measures — not just on distributing it widely.
Step 5 — Define Evaluation and Validation Strategy
Every project must define how success or validity will be measured.
The exact form depends on the project type, but it must be quantifiable or systematically verifiable.
| Project Type | Possible Evaluation Criteria |
|---|---|
| Prediction / classification | Accuracy, sensitivity, specificity, F1, ROC-AUC |
| Regression / estimation | RMSE, MAE, R² |
| Signal analysis / simulation | Agreement with reference data, signal fidelity, reproducibility |
| Experimental / hardware | Physical measurement accuracy, stability, response latency, or calibration error |
| Systematic review | Number of studies included, pooled effect size, risk-of-bias scoring, PRISMA compliance |
| Survey-based study | Internal consistency (Cronbach's α), inter-rater reliability, distribution of responses, or qualitative coding reliability |
| Questionnaire design | Construct validity — does the question measure what you intend? Pilot the survey to ensure interpretations align with your aim. |
Be careful what you are actually measuring. For example, asking clinicians “Are you uncertain in borderline CTG cases?” does not measure confidence or competence — it simply restates the definition of a borderline case. Pilot your survey and verify that your respondents interpret questions as you intend.
Evaluation is not just about scoring performance — it's about verifying that your method, model, or instrument genuinely captures what you claim it does.
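Most of the criteria in the table above reduce to a few lines of code once predictions and references are in hand. A minimal sketch (with made-up numbers, using only NumPy) of three of them — classification accuracy, RMSE, and Cronbach's α:

```python
import numpy as np

# --- Prediction / classification: accuracy ---
y_true = np.array([1, 0, 1, 1, 0, 1])        # reference labels (toy data)
y_pred = np.array([1, 0, 0, 1, 0, 1])        # model predictions
accuracy = np.mean(y_true == y_pred)         # fraction of correct predictions

# --- Regression / estimation: root-mean-square error ---
ref = np.array([0.62, 0.55, 0.71, 0.68])     # e.g. reference step lengths (m)
est = np.array([0.60, 0.57, 0.69, 0.70])     # algorithm estimates
rmse = np.sqrt(np.mean((ref - est) ** 2))

# --- Survey-based study: Cronbach's alpha (internal consistency) ---
# Rows = respondents, columns = items intended to measure the same construct.
items = np.array([[4, 5, 4],
                  [3, 3, 4],
                  [5, 5, 5],
                  [2, 3, 2]])
k = items.shape[1]
item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
total_var = items.sum(axis=1).var(ddof=1)    # variance of respondents' totals
alpha = k / (k - 1) * (1 - item_var / total_var)

print(f"accuracy={accuracy:.2f}, RMSE={rmse:.3f} m, alpha={alpha:.2f}")
```

The computations are trivial; the hard part — as the CTG survey example shows — is making sure the numbers you feed in actually measure what you claim.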
Step 6 — Assess Feasibility
Before finalizing your proposal, verify that your plan is realistic within one semester and with your available resources.
Check these aspects:
- Data / participants: Are datasets or participants (for surveys) realistically accessible and ethically approved?
- Skills & tools: Does your team have the technical and analytical skills needed?
- Compute / hardware: Do you have access to equipment or computational resources?
- Timeline: Can all planned stages — from pilot testing to analysis — be completed in time?
- Fallback plan: What happens if a dataset, collaborator, or survey response rate falls short?
Projects that are over-ambitious, tool-centered, or methodologically vague rarely succeed.
Choose depth and clarity over scale or novelty.
A small, carefully validated study is far more valuable than a flashy but meaningless prototype or questionnaire.
Step 7 — Assemble Your Proposal
Once your idea is refined and feasible, it’s time to assemble everything into a clear, structured document.
Use the official Project Proposal Template as your guide.
It ensures consistency across teams and helps instructors review your plan efficiently.
Your proposal should capture:
- the motivation and background of your project,
- your research question or hypothesis,
- the data you’ll use or collect,
- your planned methods and evaluation criteria, and
- a short reflection on feasibility and potential risks.
Keep your proposal concise, focused, and realistic — it defines your starting point, not a fixed contract.
Refinements to methods, metrics, or hypotheses as you progress are normal and encouraged.
Submission:
Send your completed proposal (PDF or Markdown) to Prof. Martin Daumer, CC Pooja N. Annaiah, by the topic-approval deadline.
Refinement Checklist
Before moving on to the Project Proposal Guide, confirm that all boxes are checked:
- Defined a specific, testable research question
- Identified data source(s), experimental setup, or evidence base and ensured access or feasibility
- Clarified inputs, outputs, and evaluation criteria (metrics, validation, or analysis plan)
- Motivated the clinical, scientific, or societal relevance of the question
- Outlined feasible methods (computational, experimental, or analytical) and a fallback plan
- Verified that the project can be completed within one semester
- Compiled all information into the official proposal template and prepared it for submission
When every box is ticked, you are ready to submit your formal proposal.
Recommended Workflow
| Week | Activity | Outcome |
|---|---|---|
| 1 | Brainstorm ideas, review the Topic Bank, and form teams | Initial concept or shortlist of topics |
| 2 | First mini-pitch and discussion | Early feedback and feasibility check |
| 3 | Refine question, identify data or collaborators, pilot methods | Draft outline and preliminary validation plan |
| 4 | Finalize methods, evaluation strategy, and proposal template | Ready for proposal submission and approval |
This timeline is indicative. Some projects — especially those involving surveys, clinical collaborations, or experimental setups — may need to start preparations earlier to allow time for approvals and pilot testing.