💻 Information Systems

Fundamental System Development Life Cycle Phases


Why This Matters

The System Development Life Cycle (SDLC) is the conceptual framework that explains how and why information systems succeed or fail. When you're tested on SDLC, you're really being asked to demonstrate understanding of project management principles, requirements engineering, quality assurance, and continuous improvement. These phases show up in every real-world IT project, from building a simple database to deploying enterprise-wide systems.

For your exam, you need to understand the logical flow between phases and recognize that each phase addresses a specific type of risk. Planning mitigates scope creep. Analysis prevents building the wrong system. Testing catches defects before users do. Don't just memorize the phase names. Know what problem each phase solves and how skipping or rushing a phase creates downstream failures.


Foundation Phases: Defining the Problem

Before any code gets written or any system gets designed, organizations must establish what they're building and why. These phases prevent the most expensive mistake in systems development: building something nobody needs.

Planning

Planning is about answering one big question: should we do this project, and if so, what are its boundaries?

  • Project scope and objectives establish boundaries that prevent scope creep and ensure alignment with strategic business goals. A well-defined scope statement makes every later decision easier.
  • Stakeholder identification and initial requirements gathering create the foundation for all subsequent phases. Missing a key stakeholder here causes costly rework later, because their needs won't surface until the system is already built.
  • Feasibility analysis evaluates whether the project is viable across multiple dimensions: technical (can we build it?), economic (is the ROI worth it?), and operational (will people actually use it?).
  • Risk assessment and mitigation planning addresses potential failures proactively, including budget overruns, timeline delays, and technical obstacles.

Analysis

Analysis shifts from "should we build it?" to "what exactly should it do?" This is where vague business needs get translated into specific, documented requirements.

  • Requirements analysis distinguishes between what users say they want and what they actually need. Users might ask for "a faster report," but analysis reveals they need real-time dashboard access instead.
  • Gap analysis compares current-state systems against desired functionality to identify specific improvements needed.
  • Functional vs. non-functional requirements documentation separates what the system does (process an order, generate a report) from how well it performs (response time under 2 seconds, 99.9% uptime, encryption of sensitive data).
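The functional/non-functional distinction can be made concrete with automated checks. Below is a minimal sketch, assuming a hypothetical order-processing function (not from any real system): one assertion verifies behavior (functional), the other verifies a performance threshold (non-functional).

```python
import time

# Hypothetical stand-in for a real order-processing component.
def process_order(items):
    return {"status": "confirmed", "count": len(items)}

# Functional requirement: the system processes an order correctly.
result = process_order(["widget", "gadget"])
assert result["status"] == "confirmed"

# Non-functional requirement: response time stays under 2 seconds.
start = time.perf_counter()
process_order(["widget"] * 1000)
elapsed = time.perf_counter() - start
assert elapsed < 2.0, f"took {elapsed:.2f}s, exceeding the 2s requirement"
```

Note that both checks exercise the same function; what differs is the kind of requirement being validated, which is exactly the distinction exam questions probe.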

Compare: Planning vs. Analysis: both gather requirements, but planning focuses on feasibility and scope while analysis dives into detailed specifications. If a question asks about "determining whether to proceed with a project," that's planning. If it asks about "documenting user needs," that's analysis.


Construction Phases: Building the Solution

Once requirements are locked, the focus shifts to creating the system. These phases transform abstract requirements into concrete, functional technology through systematic design and development practices.

Design

Design is the blueprint phase. No code is written yet, but every major technical decision gets made here.

  • System architecture defines how components interact. This is where decisions about centralized vs. distributed systems, cloud vs. on-premise hosting, and integration points get made.
  • User interface (UI) design directly impacts adoption rates. Poor usability is a leading cause of system rejection, even when the underlying technology works perfectly. Wireframes and prototypes are common design deliverables.
  • Data models and database design establish how information flows and is stored, affecting system performance and reporting capabilities.
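To see how design-phase data modeling shapes later performance and reporting, here is a small sketch using a hypothetical customer/order schema (the table and column names are illustrative, not from the text). The index is a design-time decision made specifically to support a known reporting query.

```python
import sqlite3

# Design-phase data model sketched as SQL DDL (in-memory database for illustration).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE "order" (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        total       REAL NOT NULL,
        placed_at   TEXT NOT NULL  -- ISO-8601 timestamp
    );
    -- Index chosen at design time to support a known reporting need.
    CREATE INDEX idx_order_customer ON "order"(customer_id);
""")

# The anticipated reporting query: order counts per customer.
conn.execute("INSERT INTO customer VALUES (1, 'Acme Corp')")
conn.execute('INSERT INTO "order" VALUES (1, 1, 99.50, \'2024-01-15T10:00:00\')')
rows = conn.execute(
    'SELECT c.name, COUNT(o.order_id) FROM customer c '
    'JOIN "order" o ON o.customer_id = c.customer_id GROUP BY c.name'
).fetchall()
```

The point is not the SQL itself but the sequencing: the relationships, constraints, and indexes are decided before implementation begins, because changing them after data is loaded is far more expensive.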

Implementation

Implementation is where the system actually gets built and deployed. It covers coding, integration, and preparing users for the transition.

  • Development execution follows design specifications. Deviations here introduce technical debt, meaning shortcuts that create maintenance headaches down the road.
  • System integration and data migration connect new systems to existing infrastructure. This is often the riskiest technical activity in the entire SDLC because legacy data can be inconsistent, incomplete, or formatted differently than the new system expects.
  • User training bridges the gap between a technically functional system and one that actually gets used. Change management is as important as code quality. A perfect system that nobody knows how to operate delivers zero value.
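The data-migration risk described above often comes down to normalization: legacy records arrive in inconsistent formats the new system won't accept. A minimal sketch, assuming a hypothetical legacy export with mixed date formats:

```python
from datetime import datetime

# Known date formats observed in the (hypothetical) legacy export.
LEGACY_DATE_FORMATS = ["%m/%d/%Y", "%Y-%m-%d", "%d-%b-%Y"]

def normalize_date(raw):
    """Try each known legacy format; return None for unparseable records."""
    for fmt in LEGACY_DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    return None  # route to manual review instead of silently loading bad data

legacy_rows = [
    {"id": 1, "joined": "03/15/2021"},
    {"id": 2, "joined": "2022-07-01"},
    {"id": 3, "joined": "sometime in 2020"},  # dirty legacy data
]

migrated, rejected = [], []
for row in legacy_rows:
    normalized = normalize_date(row["joined"])
    if normalized is None:
        rejected.append(row["id"])
    else:
        migrated.append({"id": row["id"], "joined": normalized})
```

Routing unparseable records to a review queue rather than dropping or guessing them is the key design choice: it keeps the migration auditable, which matters when the legacy system is decommissioned.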

Compare: Design vs. Implementation: design answers "how will we build it?" while implementation actually builds and deploys it. Exam questions often test whether you can identify which phase a specific activity belongs to. Creating wireframes = design. Writing code = implementation. Training users = implementation.


Quality Assurance Phase: Validating the Solution

Testing isn't just "checking if it works." It's a systematic process of verifying that the system meets specifications and validating that it solves the original business problem. Those two words matter: verification asks "did we build it right?" and validation asks "did we build the right thing?"

Testing

Testing follows a hierarchy, where each level catches different types of problems:

  1. Unit testing validates individual components in isolation (does this one function calculate tax correctly?).
  2. Integration testing checks whether components work together properly (does the shopping cart pass the right data to the payment processor?).
  3. System testing evaluates end-to-end functionality against technical specifications. The IT team performs this.
  4. User acceptance testing (UAT) is the final gate. Business users test the system against real-world scenarios to confirm it meets their actual needs.
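The unit/integration distinction in levels 1 and 2 can be sketched in code. Assuming a hypothetical tax function and shopping cart (illustrative stand-ins, not a real system), the unit test isolates one function while the integration test checks the hand-off between components:

```python
import unittest

# Hypothetical components standing in for real system modules.
def calculate_tax(subtotal, rate=0.08):
    return round(subtotal * rate, 2)

class Cart:
    def __init__(self):
        self.items = []
    def add(self, price):
        self.items.append(price)
    def checkout_total(self):
        subtotal = sum(self.items)
        return subtotal + calculate_tax(subtotal)  # integration point

class UnitTests(unittest.TestCase):
    def test_tax_calculation(self):
        # Unit level: one function, in isolation.
        self.assertEqual(calculate_tax(100.00), 8.00)

class IntegrationTests(unittest.TestCase):
    def test_cart_total_includes_tax(self):
        # Integration level: does the cart pass the right data to the tax logic?
        cart = Cart()
        cart.add(50.00)
        cart.add(50.00)
        self.assertEqual(cart.checkout_total(), 108.00)
```

A unit test could pass while the integration test fails (say, if the cart passed item counts instead of prices), which is exactly why the hierarchy exists: each level catches defects the level below cannot see.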

Beyond functional testing, performance and load testing ensures the system handles real-world conditions like peak usage times, large data volumes, and concurrent users.
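A minimal load-test sketch, using threads to simulate concurrent users against a hypothetical request handler (real load testing uses dedicated tools, but the principle is the same): fire requests concurrently and check that total time under load stays acceptable.

```python
import threading
import time

# Hypothetical request handler; the sleep stands in for real processing time.
def handle_request():
    time.sleep(0.01)

def run_load(concurrent_users):
    """Simulate N users hitting the system at once; return total elapsed time."""
    threads = [threading.Thread(target=handle_request) for _ in range(concurrent_users)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start

elapsed = run_load(50)
# Under concurrency, 50 users should finish far faster than 50 serial requests.
assert elapsed < 0.5
```

The exam-relevant idea is that functional tests with one user cannot reveal this class of defect; only deliberately generated concurrent load can.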

Test case documentation creates repeatable, traceable validation procedures. Undocumented testing is essentially worthless for audit purposes because you can't prove what was tested or demonstrate compliance.

Compare: System Testing vs. User Acceptance Testing: system testing is performed by IT teams against technical specifications, while UAT is performed by business users against real-world scenarios. Both must pass before go-live, but they catch different types of defects. A system can pass every technical test and still fail UAT because it doesn't match how users actually work.


Sustainment Phases: Ensuring Long-Term Value

A system's launch is just the beginning. These phases ensure the investment continues delivering value and adapts to changing business needs over time.

Maintenance and Support

Once a system is live, it needs ongoing care. Maintenance keeps the system running and relevant.

  • Ongoing user support resolves issues and maintains productivity. Support ticket patterns often reveal design flaws or training gaps that weren't caught earlier.
  • System updates fall into three categories:
    • Corrective: bug fixes
    • Perfective: performance improvements and usability enhancements
    • Adaptive: changes made in response to new business requirements or external factors (like regulatory changes)
  • Performance monitoring uses metrics and dashboards to detect degradation before users experience problems.
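Threshold-based alerting is one common way performance monitoring detects degradation early. A sketch, assuming hypothetical response-time samples in milliseconds (the thresholds are illustrative, not standard values):

```python
# Illustrative alert thresholds for average response time, in milliseconds.
WARN_MS, CRITICAL_MS = 800, 1500

def classify(sample_ms):
    """Flag degradation before users notice: warn early, escalate on critical."""
    if sample_ms >= CRITICAL_MS:
        return "critical"
    if sample_ms >= WARN_MS:
        return "warning"
    return "ok"

# Hypothetical samples from successive monitoring windows.
samples = [420, 610, 950, 1600, 700]
alerts = [(ms, classify(ms)) for ms in samples if classify(ms) != "ok"]
```

The warning tier is the point of the exercise: it surfaces the 950 ms sample while users still consider the system responsive, giving the support team time to act before the 1600 ms critical breach.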

Evaluation

Evaluation zooms out from day-to-day operations to ask the bigger question: is this system actually delivering the value we expected?

  • Performance assessment compares actual outcomes against the objectives established during planning, closing the accountability loop.
  • Post-implementation review captures lessons learned while they're fresh. Organizations that skip this phase repeat the same mistakes on the next project.
  • Stakeholder feedback informs the next iteration or project, connecting evaluation back to planning in a continuous improvement cycle. This is what makes the SDLC a cycle rather than a one-way process.

Compare: Maintenance vs. Evaluation: maintenance is operational (keeping the system running day to day), while evaluation is strategic (determining if the system delivers business value). Both happen post-implementation, but they serve different purposes and involve different stakeholders.


Quick Reference Table

  • Risk Mitigation: Planning (risk assessment), Testing (defect detection)
  • Requirements Engineering: Analysis (detailed requirements), Planning (initial scope)
  • Technical Construction: Design (architecture), Implementation (development)
  • Quality Assurance: Testing (all levels), Evaluation (performance review)
  • Change Management: Implementation (training), Maintenance (user support)
  • Continuous Improvement: Evaluation (lessons learned), Maintenance (updates)
  • Stakeholder Engagement: Planning (identification), Analysis (validation), Evaluation (feedback)

Self-Check Questions

  1. Which two phases both involve gathering requirements, and how do they differ in scope and depth?

  2. A company discovers after launch that their new system can't handle peak holiday traffic. Which phase failed, and what specific activity should have caught this problem?

  3. Compare and contrast system testing and user acceptance testing. Who performs each, what do they validate, and why are both necessary?

  4. If a post-implementation review reveals that the system doesn't align with business goals, which earlier phase likely had deficiencies, and what activities should have prevented this?

  5. A scenario describes users refusing to adopt a new system despite it being technically functional. Which phase activities address this problem, and what should have been done differently?