Process models in the practice of distributed software development
Researchers have accordingly developed computable models to study these issues using task precedence networks. Early work focused on applying the program evaluation and review technique (PERT) to lay out the plan of work for development projects and then to focus management attention on the critical path (Pocock). Other DDP modelling approaches explicitly represent dynamic flows of information in a process using variants of the Petri net.
This is a formal approach which, in its simplest form, represents a process as a network of places and transitions (Van der Aalst). Appropriately constructed Petri nets allow the dynamic behaviours of serial, parallel, and iterative task patterns to be modelled. For instance, McMahon and Xianyi use a Petri net-based process model as the basis of an automatic controller which directs computer processes to design a crankshaft.
A shortcoming of the Petri net is that logical problems such as deadlocks can appear if the net is not appropriately structured, which becomes more difficult to achieve as the complexity of information flows and the number of possible routes increase. Considering these problems, Ha and Suh develop a set of Petri net templates that each represent a certain pattern of DDP task interactions.
Larger models can then be assembled from these templates. Another issue is that, in the DDP context, changes to the planned process are often required during its execution. This is also difficult to handle using Petri nets. Karniel and Reich address this issue with an approach to automatically generate or update a Petri net from a Task DSM (discussed in Sect.).
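To make the place/transition mechanics concrete, the following is a minimal sketch in Python. The task names and net structure are invented for illustration, and the sketch omits concurrency conflicts, arc weights, and the other features a full Petri net tool would provide.

```python
# Minimal Petri net sketch (hypothetical task names): places hold tokens,
# and a transition fires when every input place holds a token, consuming
# one token from each input place and adding one to each output place.
marking = {"spec_ready": 1, "concept_done": 0, "analysis_done": 0}

# Transitions: (name, input places, output places)
transitions = [
    ("do_concept",  ["spec_ready"],   ["concept_done"]),
    ("do_analysis", ["concept_done"], ["analysis_done"]),
]

def enabled(t, m):
    _, ins, _ = t
    return all(m[p] >= 1 for p in ins)

def fire(t, m):
    _, ins, outs = t
    for p in ins:
        m[p] -= 1
    for p in outs:
        m[p] += 1

# Run until no transition is enabled (completion, or a deadlock if the
# net is badly structured).
fired = []
while True:
    ts = [t for t in transitions if enabled(t, marking)]
    if not ts:
        break
    fire(ts[0], marking)
    fired.append(ts[0][0])

print(fired)    # ['do_concept', 'do_analysis']
print(marking)  # {'spec_ready': 0, 'concept_done': 0, 'analysis_done': 1}
```

The deadlock problem noted above corresponds to the loop terminating while tokens remain short of the final place, which is easy to produce by mis-wiring the input and output place lists.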
A more descriptively elaborate but less formal computable model based on a graphical precedence approach is the applied signposting model (ASM) developed by Wynn et al. The ASM is based on a hierarchical flowchart representation intended to be scalable and familiar to practitioners. Similar to DR, tasks are specified in terms of input and output information, different dependency types can be represented, and an abstraction hierarchy of tasks and design parameters is provided, with tool support to automatically generate simplified views (Wynn). The ASM simulation algorithm was developed to handle processes having multiple intertwined iteration loops, which are difficult to configure in many other approaches.
The ASM was developed and applied through industry collaborations in the aerospace sector; Kerley et al. and Hisarciklilar et al., for example, report such applications. The ASM also laid groundwork for approaches to predict change propagation in a design process (Wynn et al.). A strength of graphical task precedence approaches is their intuitive flowchart-style notation, which can be easily understood by most people.
However, such models also have limitations. As is apparent in the example of Fig., larger models can become difficult to lay out and read. Flows that connect across a long distance of the model are especially difficult to read and manipulate. Some of these difficulties may be partially addressed by organising a model hierarchically into subprocesses, but this can introduce further challenges in managing and visualising connections that cross levels, and can also cause problems if the hierarchy later needs to be repartitioned.
Another consideration is that if a model is used for simulation, some schemes require careful configuration and painstaking verification to ensure it operates as intended in all scenarios, especially if it incorporates a dense structure of dependencies with concurrent flows and intertwined iteration loops (Karniel and Reich). ProModeller is a task precedence approach which is not based on node-arrow diagramming and thus avoids some of these issues.
This system allows modellers to represent a process by hierarchically combining process elements drawn from a standard library comprising around 50 objects (Freisleben and Vajna), each representing either a type of task or a structural element.
Tasks can be configured when instantiated into a model. Structural elements are essentially hierarchical containers that specify the procedure for attempting the objects nested within them: sequentially; in a cycle of iterative refinement; concurrently; or by selecting one from a set of alternatives (Vajna). Because process behaviour is reflected directly in this structure, models constructed using the approach are logically correct by construction.
This may facilitate the distribution of modelling effort among many process participants. On the other hand, in comparison to graphical network approaches, tree-structured approaches like ProModeller provide less flexibility for modelling complex information flows and arguably a less visually intuitive representation.
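The container idea can be sketched as evaluation over a tree. The element kinds, task names, and durations below are illustrative inventions, not ProModeller's actual library; the point is that each container states how its children are attempted, so any well-formed tree yields a logically valid process.

```python
# Tree-structured process sketch in the spirit of ProModeller (names and
# numbers invented). Containers: "sequence", "parallel", "iterate".
def duration(node):
    kind = node[0]
    if kind == "task":
        return node[2]                          # ("task", name, time)
    children = node[1]
    if kind == "sequence":                      # attempt children in order
        return sum(duration(c) for c in children)
    if kind == "parallel":                      # attempt children concurrently
        return max(duration(c) for c in children)
    if kind == "iterate":                       # ("iterate", children, cycles)
        return node[2] * sum(duration(c) for c in children)
    raise ValueError(kind)

process = ("sequence", [
    ("task", "specify", 2),
    ("parallel", [("task", "design_mech", 5),
                  ("task", "design_elec", 4)]),
    ("iterate", [("task", "test", 1), ("task", "refine", 2)], 3),
])
print(duration(process))   # 2 + max(5, 4) + 3 * (1 + 2) = 16
```

Because behaviour is determined by the containers rather than by free-form arrows, there is no way to express a deadlocked or dangling flow, which is the logical-correctness guarantee noted above; the price is that arbitrary cross-cutting information flows cannot be drawn.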
The task precedence models discussed in this subsection may be especially useful where design processes are relatively routine, while also involving enough complexity that stakeholders may not fully understand them prior to modelling. These situations do often occur in practice, for instance in the evolutionary development of large-scale designs (Wynn et al.). The situated and responsive aspects of designing may be embedded in the possibility of some tasks triggering iteration, or may occur within individual tasks and thus be below the level of resolution of a model.
They may also render a model inaccurate if they lead to changes in the tasks that are needed or in the way information flows between them. To recap, task dependency models represent the information dependencies between tasks as well as, or instead of, a procedure for attempting them. Such models emphasise that the tasks could be organised in several ways.
For example, they could be attempted in different sequences or in parallel. Approaches which incorporate dependency models are based on the premise that a process can be improved by studying the underlying structure of the situation.
[Figure and caption reproduced from Eppinger et al.] The most well-known model in this category is probably the design structure matrix (DSM) introduced by Steward. A DSM is a square matrix in which a mark in a cell indicates that the element in the row depends upon that in the column (see the example in Fig.). Where the elements represent tasks and the connections represent information dependencies, the matrix is called a Task DSM (Eppinger et al.).
If all the marks lie below the leading diagonal in one or more of the possible orderings of the rows and columns, the process may be completed by attempting tasks sequentially or in parallel.
Conversely, if it is not possible to find such an ordering, some of the tasks are interdependent and iteration may be required to resolve them (Eppinger et al.). Algorithms have been developed to analyse a DSM to examine or exploit such structural characteristics; these include sequencing, i.e., attempting to find a lower-diagonal reordering of the rows and columns. The Task DSM has been extensively adopted in research literature as the basis of models to analyse DDP characteristics, especially those related to decomposition and integration.
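A minimal sketch of DSM sequencing in Python follows. The four-task matrix is a hypothetical example: tasks whose inputs are all already placed can be appended to the order, and whatever remains when no task is ready belongs to a dependency cycle (or sits downstream of one).

```python
# Hypothetical Task DSM: dsm[i] is the set of tasks that task i needs
# information from (row-depends-on-column, stored sparsely).
tasks = ["A", "B", "C", "D"]
dsm = {
    "A": set(),          # A needs nothing
    "B": {"A", "C"},     # B and C need each other -> interdependent pair
    "C": {"A", "B"},
    "D": {"B"},          # D needs B
}

def sequence(dsm):
    """Try to order tasks so every dependency points backwards (a
    lower-diagonal DSM). Tasks left over are inside or behind a cycle
    and may require iteration to resolve."""
    remaining = dict(dsm)
    order = []
    while remaining:
        ready = [t for t, deps in remaining.items()
                 if deps <= set(order)]      # all inputs already placed
        if not ready:
            break                            # cycle: coupled tasks remain
        for t in sorted(ready):
            order.append(t)
            del remaining[t]
    return order, set(remaining)

order, coupled = sequence(dsm)
print(order)            # ['A'] -- only A can precede the B<->C loop
print(sorted(coupled))  # ['B', 'C', 'D'] -- the coupled pair plus its dependent
```

This is the simplest form of the sequencing analysis mentioned above; practical DSM tools additionally tear or cluster the coupled block rather than just reporting it.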
The key consideration here is that when a high-level task such as designing a system is decomposed into subtasks that will be undertaken by different people or teams, interdependencies are invariably created between those subtasks. One seminal meso-level model considering these issues is the work transformation matrix (WTM) developed by Smith and Eppinger (a). The WTM focuses on situations in which interdependent tasks are executed in parallel with frequent information transfer to manage their interdependencies.
It assumes that each task in such a group continuously creates iteration work for the others that depend on it, at a constant rate. The dependencies and their corresponding rates are represented in a Task DSM. Smith and Eppinger (a) show how eigenstructure analysis can be used to identify the drivers of iteration within a coupled task group if the WTM assumptions hold. Assuming instead that tasks are executed in sequence, such that each task might create rework for others already completed if a dependency exists between them, Browning and Eppinger build on the earlier work of Smith and Eppinger (b) to develop a Monte Carlo simulation model, which they use to evaluate the cost and schedule risk associated with different task sequences and thereby identify the best sequence for a given task decomposition.
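The parallel rework idea can be made concrete with a small numerical sketch. The 0.4/0.5 rework fractions below are invented for illustration and the eigenvalue estimate uses plain iteration rather than a full eigenstructure analysis, but the key WTM property is visible: the coupled group converges only if the dominant eigenvalue of the rework matrix is below 1.

```python
# Illustrative parallel-rework sketch: W[i][j] is the fraction of the work
# just completed by task j that reappears as rework for task i in the next
# iteration step. The 0.4 / 0.5 values are invented, not from the WTM paper.
W = [[0.0, 0.4],
     [0.5, 0.0]]

def mat_vec(W, u):
    return [sum(W[i][j] * u[j] for j in range(len(u))) for i in range(len(u))]

# Estimate the spectral radius from the geometric decay of the work vector:
# iteration dies out only if this value is below 1.
u, k = [1.0, 1.0], 60
for _ in range(k):
    u = mat_vec(W, u)
rho = max(u) ** (1.0 / k)
print(round(rho, 3))   # 0.447 -> iteration converges for this matrix

# Total work per task over all iteration steps, u0 + W u0 + W^2 u0 + ...,
# which converges towards (I - W)^-1 u0 = [1.75, 1.875].
total, work = [1.0, 1.0], [1.0, 1.0]
for _ in range(400):
    work = mat_vec(W, work)
    total = [t + w for t, w in zip(total, work)]
print(total)
```

Raising either rework fraction so that their product exceeds 1 makes the spectral radius exceed 1, in which case the work series diverges: the model's signature of a coupled group that cannot converge without restructuring.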
These two models, described respectively as parallel and sequential rework models, have influenced many other research articles. The Task DSM provides a compact notation which can be especially useful for processes involving dense structures of information dependency.
Achieving a comprehensible visual layout is likely to be easier than when graphical networks are used. Another advantage is that the approach can be applied without specialised software, since many computations can be expressed and programmed as operations over the matrix cells. On the other hand, some weaknesses are also apparent. DSMs are not well suited to conveying detail, and it can be easy to misplace marks when constructing or reading large matrices. It is also not clear how to deal visually with opening and closing hierarchical structures in a DSM model.
Sequential and parallel flow structures are difficult to visualise (Park and Cutkosky): although clusters of tasks can be easily indicated as shown in Fig., the flow of work through the process is not immediately apparent from the matrix. More information on the Task DSM and the many related models can be found in Eppinger and Browning and in the review article by Browning. Another established dependency modelling approach is IDEF0, which uses a hierarchically structured set of diagrams to represent a system in terms of functions and the interactions between them (USAF). Applied to the DDP, functions are in essence similar to tasks.
Each IDEF0 diagram comprises between three and six functions, which are represented as boxes and interconnected by labelled arrows. Mechanism arrows enter the bottom of a box and indicate provision of a means for executing the function.
Any function box can be decomposed into a more detailed diagram showing its subfunctions. Functions can be linked across and between levels in the hierarchy, and the model may include a glossary of terms (USAF). A large set of diagrams is often needed, which can be time-consuming to produce (Colquhoun et al.).
IDEF0 has been applied to model design processes by, for example, Kusiak et al. ADePT PlanWeaver is a planning support tool for the construction industry which is based on an IDEF0-style representation, enhanced to indicate the discipline associated with each flow into a task, as well as the strength of the dependency (Austin et al.). In the approach, a library of generic construction processes is used to construct a customised process model for a specific project, which can be viewed as a Task DSM and then sequenced to minimise the scope of cycles that may cause iteration.
Identifying dependency loops that remain and finding ways to eliminate them, for instance by splitting some tasks into several parts, allows the project to be sequenced and a schedule to be produced (Austin et al.).
More recently, Romero et al. propose an extension of this style of notation, including additional symbols to distinguish the main flow of information from other interactions, such as coordination and cooperation, that are needed in a collaborative design process. To summarise, the main advantage of task dependency models is their emphasis on information flow constraints rather than procedures, because understanding constraints is helpful when constructing a plan or seeking opportunities for process improvement.
On the other hand, limitations have also been noted (Austin et al.). Task precedence and dependency models as discussed above view DDPs as essentially similar in nature to other business processes, albeit with a high level of uncertainty and with the expectation of iteration. One criticism that might be levelled at such models is that they attempt to represent design processes but do not explicitly integrate an important insight gained from research into the nature of design activity: its situatedness (see Sect.).
Rule-based models offer a possible route to address this limitation. They aim to model how process outcomes emerge through the interaction between the rules that define task properties and the design situation, which changes as tasks are executed. Some meso-level work in this area built on the Signposting approach of Clarkson and Hamilton, which was discussed in Sect.
This model was extended through a series of Ph.D. projects. Among other insights, the resulting model, called Extended Signposting, was used to show how both the probability and desirability of each route should be considered when planning a design process.
In comparison to Signposting, ATP offers more concrete criteria for selecting tasks, considering their roles in driving technical performance measures (TPMs) closer to specified targets. More recently, Wynn et al. describe a model in which uncertainty levels are associated with design information and propagate from task inputs to task outputs. To illustrate, the time spent on an FEA task would be influenced by the expected accuracy of its boundary conditions, which would propagate through the task to influence the expected accuracy of its outputs. In this model, a design is progressed through iterative cycles which continue until uncertainty levels converge to acceptable values.
Wynn et al. note that behaviour in such models emerges from rules specified task-by-task, which bypasses the requirement for an integrated overview of the process; such an overview can be difficult to develop in practice. On the other hand, when compared to the approaches discussed in the previous two subsections, rule-based models are difficult to visualise and it is not clear how to validate all the possible routes they allow.
Research towards addressing these limitations is reported by Clarkson et al.; for the moment, though, such models remain mainly of academic interest. Domain-integrating task network models explicitly integrate process models capturing an end-to-end flow of tasks with detailed information about other domains, such as the product being designed. Eckert et al. argue that such integrated models can support practical decision-making; for example, they might help to decide whether design changes should be accepted during a project, considering whether the design improvements would justify the additional time and effort in the development process.
Domain mapping matrices (DMMs) are extensions to the DSM which allow modelling of linkages between different types of element (Kusiak and Wang a; Danilovic and Browning; Lindemann et al.). Danilovic and Browning discuss the application of DMMs to explore connectivity between the process domains of tasks, components, and teams. By analysing the domains independently and in combination, it is possible to identify mismatching structures.
For example, a team structure which does not reflect the decomposition of tasks in the process may contribute to communication overhead or rework (Kreimeyer and Lindemann; Sosa et al.). Another element is the use of graph-theoretic metrics, such as betweenness centrality and cycle count, to develop insights about the importance of nodes and patterns in the network (Kreimeyer and Lindemann; Lindemann et al.). Object-process methodology (OPM) provides an integrated representation of processes and objects using a formal graphical notation or equivalent formally structured sentences (Dori). Several types of structural link allow the modeller to connect diagram elements within the process domain or within the object domain, while several types of procedural link can be used to connect elements across these two domains.
OPM is a general-purpose methodology that has been applied in different contexts. Of particular interest to this review, Sharon et al. apply it to project planning and management. In their approach, a project is decomposed into a hierarchy of tasks and deliverables, considered concurrently. The OPM representation is then analysed to generate summary views useful for project management. Sharon and Dori further develop this method, arguing that it could help to avoid mismatches and inconsistencies between the models and documents used to manage a project.
Other models integrating product and design process information have been developed with the specific objective of supporting the resolution of conflicts among design parameters. For example, the DEPNET approach stipulates modelling a process as it unfolds, along with the design information associated with each task (Ouertani and Gzara). The resulting trace constitutes a network of dependencies among information items, which can be used to assess the knock-on impact of design changes.
A similar approach is taken in CoMoDe, an object-oriented model intended to maintain a trace of the model versions that are created and used at each step in a collaborative design process (Gonnet et al.).
CoMoDe represents a hierarchy of process activities and constituent operations; requirements; the actors who perform each activity; characteristics of the artefact as it evolves; and decision rationale. Overall, process-oriented conflict management appears to be a largely theoretical approach involving step-by-step capture of design history using rather complex representations. Although the potential is demonstrated by examples, the respective authors do not report evaluation of the proposed support tools in an industry context.
Focusing on coordination in major projects, Rouibah and Caskey develop an engineering workflow (EWF) approach based on identifying the engineering parameters whose values need to be determined; this set of parameters can be partly constructed by reference to similar past projects. The parameters are linked into a network to represent their interdependencies, which can evolve during a project.
Parameters are also linked to the parties responsible for them, and each parameter value carries a hardness grade reflecting how firmly it has been fixed. Six steps are defined to transition between successive hardness grades, ensuring that each change is coordinated among the impacted parties. This approach seems to have strong potential to support coordination, helping to ensure consistency and transparency during a project. In comparison to the approaches reviewed in Sects., domain-integrating models capture more kinds of information about a project and its context. While this potentially offers more insight, it also requires more information.
Consequently, it may be difficult to create large-scale models in such approaches and ensure their consistency Park and Cutkosky , as well as to visualise and understand the models once created.
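Returning to the engineering workflow idea, the essence of the parameter network can be sketched in a few lines. The parameter names, owners, and grade scale below are invented for illustration, not taken from Rouibah and Caskey; the sketch shows only the core mechanism of propagating a change notification through the dependency network to the responsible parties.

```python
# Sketch of the EWF idea (illustrative data): parameters carry a hardness
# grade (higher = firmer), are linked into a dependency network, and a
# change to one parameter flags every downstream parameter for review.
params = {
    "shaft_diameter": {"grade": 3, "owner": "drivetrain team"},
    "bearing_bore":   {"grade": 2, "owner": "bearing supplier"},
    "housing_width":  {"grade": 1, "owner": "casing team"},
}
depends_on = {                      # child -> parameters it is derived from
    "bearing_bore":  ["shaft_diameter"],
    "housing_width": ["bearing_bore"],
}

def affected_by(changed):
    """Transitively collect parameters whose values must be re-examined."""
    hit, frontier = set(), {changed}
    while frontier:
        nxt = {c for c, parents in depends_on.items()
               if set(parents) & frontier and c not in hit}
        hit |= nxt
        frontier = nxt
    return hit

for p in sorted(affected_by("shaft_diameter")):
    info = params[p]
    print(f"notify {info['owner']}: review {p} (hardness {info['grade']})")
```

In the full approach the notification would also depend on the hardness grades involved, since firmer values demand more formal coordination before they may be changed.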
There are many other approaches in this category. For focused reviews of integrated models and further discussion of their advantages and limitations, the reader is referred to Eckert et al. Finally, agent-based models (ABMs) have been developed that combine meso-level task relationships with micro-level models of agent behaviour.
Such models offer the possibility to study factors impacting a process in a more realistic context than the other models described in this section. For instance, they can incorporate factors such as organisational structures and the many non-design activities that project participants must attend to—such as going to meetings, chasing colleagues for information, and other coordination activity that emerges as a project unfolds.
In one influential example, the virtual design team (VDT) developed by Cohen, Christiansen, and colleagues represents individual designers and managers in a project as information-processing agents. These agents interact by generating and responding to messages according to rules. Messages can involve passing design information between tasks and also the handling of exceptions, which occur when an agent must stop work and seek more information before they can complete their assigned task.
In the model, message handling depends on factors such as the organisation structure and the communication tools available (Levitt et al.). Some advantages of ABMs were discussed at the start of this subsection. In addition, it may be noted that ABMs can represent the decisions of situated actors and thus may be well suited to account for the responsive and emergent facets of the DDP (Garcia). In terms of disadvantages, developing an ABM requires complex configuration or programming of a specialised tool, which may be beyond the reach of many would-be modellers.
Second, the models are each unique and do not lend themselves to graphical representation. As a result, their mechanics can be opaque to anyone except their creator, which might lead to credibility concerns. Finally, although ABMs might be helpful for building understanding of the factors influencing DDP performance, they cannot easily be used to document or prescribe a process.
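The exception mechanism described above can be illustrated with a toy simulation, far simpler than VDT itself. The agents, work amounts, and probabilities are invented: each time step, a busy agent either completes a unit of work or raises an exception, in which case it idles until the requested information arrives.

```python
import random

# Toy agent-based sketch in the spirit of VDT (all numbers illustrative):
# agents work through assigned steps; each step may raise an exception
# that forces the agent to stop and wait for a reply before continuing.
random.seed(1)

P_EXCEPTION = 0.2     # chance per step that an agent needs missing information
REPLY_DELAY = 3       # steps an agent idles while waiting for an answer

agents = {"designer": 10, "analyst": 8}   # remaining work steps per agent
waiting = {a: 0 for a in agents}
clock = messages = 0

while any(w > 0 for w in agents.values()):
    clock += 1
    for a in agents:
        if agents[a] == 0:
            continue
        if waiting[a] > 0:                 # idle: awaiting information
            waiting[a] -= 1
        elif random.random() < P_EXCEPTION:
            messages += 1                  # raise exception, request info
            waiting[a] = REPLY_DELAY
        else:
            agents[a] -= 1                 # productive work step

print(clock, messages)   # totals depend on the random seed
```

Even this caricature reproduces the qualitative VDT result that project duration is driven as much by exception handling and waiting as by the nominal work content, and that the penalty grows with the exception rate and reply latency.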
Some abstract models conceptualise the design process as a series of tasks that transition in a progressive way between the different types of information or knowledge used as a design is created. In contrast to other categories of meso-level model, they do not specify or analyse tasks in detail.
Overviews of the product-focused aspects of this work can be found in Buur, Andreasen, Eder, and Hubka. Applying these concepts to the design process, the Theory of Domains (recently summarised by Andreasen and by Andreasen et al.) describes how a design is established through a series of domains.
The theory states that designers establish these domains in sequence, noting that stepping back and forth between them is also likely (Buur). Within each domain, a design is described by multiple product models that can each be categorised on a two-dimensional grid, one dimension of which ranges from abstract to concrete.
Micro-level procedural models, such as those reviewed in Sect., can also be related to this framework. [Figure reproduced from Andreasen with permission of the author.] The Theory of Domains views design as a process in which design information is established through increasingly concrete domains, and it provides a framework in which the models and methods used during design can be positioned. Related to the Theory of Domains, Grabowski et al. describe the design process in terms of solution states within design working spaces (DWS). The design process is seen as a series of operations in which a solution is progressively developed within its DWS by stepwise moves that can be categorised on three dimensions.
The first dimension is concretisation vs. abstraction. For example, concretisation might move a solution state from functions to structures, while abstraction might move it in the opposite direction. The second dimension, detailing vs. its counterpart, concerns the level of detail in which a solution state is described.
On the third dimension, variation refers to searching for alternative solutions on the same level of abstraction, while its counterpart, limitation, refers to adding constraints that reduce the solution space (Grabowski et al.). In another related model, a design process is presented as a collection of synthesis tasks, which determine or create characteristics from desired properties, and analysis tasks, which determine properties from characteristics.
The model suggests that tasks are also influenced by external conditions, such as load cases, and can be supported through prescriptive methods such as those discussed in the previous sections. Key features of design that the model aims to encompass include: how the process is driven by the difference between desired and real properties; how the product definition becomes more complete over time as more characteristics are created and their values determined; how partial solutions can be integrated into an emerging design; and how iterations may be caused by conflicts.
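The synthesis/analysis loop described above can be illustrated with a one-parameter toy example. The beam-like formula, target, and update rule are invented: analysis determines a property (deflection) from a characteristic (thickness), and the gap between desired and analysed properties drives the next synthesis move.

```python
# Toy synthesis/analysis iteration (formula and numbers invented):
# the characteristic is a thickness; the property is a deflection.
def analyse(thickness_mm):
    return 120.0 / thickness_mm ** 3       # analysis: property from characteristic

target_deflection = 1.0                    # desired property
thickness = 2.0                            # initial characteristic

for step in range(50):
    deflection = analyse(thickness)        # real property of current design
    if abs(deflection - target_deflection) < 0.01:
        break                              # desired and real properties agree
    # synthesis move: adjust the characteristic to close the gap
    thickness *= (deflection / target_deflection) ** (1 / 3)

print(step, round(thickness, 2), round(analyse(thickness), 3))  # 1 4.93 1.0
```

The loop terminates when the real property matches the desired one, mirroring the model's claim that the process is driven by the difference between desired and real properties.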
The key distinction is that models in this category are created as mathematical or computational tools for research in which representative or synthetic cases are analysed to extract general insights—whereas the analytical models discussed earlier provide approaches that practitioners might in principle use to model, analyse, and improve their specific situations. One stream of work in this category focuses on developing mathematical models to study how concurrency may help to reduce lead time by bringing more resource to bear, at the cost of increased rework.
For example, the models of AitSahlia et al. demonstrate how the tipping point, at which further increases in concurrency start to yield increases instead of reductions in process duration, is determined by the probability of each task creating rework for others.
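A stylised sketch (not AitSahlia et al.'s actual formulation) shows how such a tipping point can arise: splitting n equal tasks into g concurrent groups shortens the nominal schedule by a factor of g, but each additional concurrent neighbour raises the chance that a task is disturbed and must absorb rework, inflating effective task time. All parameter values are invented.

```python
# Stylised concurrency/rework trade-off (all numbers illustrative).
def expected_duration(g, n=12, d=5.0, p=0.3, rework_frac=0.95):
    # chance a task is disturbed by at least one of its g - 1 concurrent peers
    q = 1 - (1 - p) ** (g - 1)
    # nominal schedule shrinks with g; rework inflates effective task time
    return (n / g) * d / (1 - rework_frac * q)

for g in (1, 2, 3, 4, 6, 12):
    print(g, round(expected_duration(g), 1))
# 1 60.0 / 2 42.0 / 3 38.8 / 4 39.9 / 6 47.7 / 12 72.7
# -> duration falls, then rises again: the tipping point here is at g = 3
```

Lowering the disturbance probability p pushes the optimum towards higher concurrency, which is the qualitative relationship the text describes.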
Similar issues are considered by Hoedemaker et al. Other authors consider design reviews; for example, Ha and Porteus develop a mathematical model to study the optimal timing of such reviews during concurrent product and process design.
In this model, the desirable effects of frequent design reviews are to find flaws before they are incorporated into the design, and to validate interim product design work so that it can be released to process design, enabling concurrency.
This is set against the time required to set up and execute the reviews. Ha and Porteus show that the optimal frequency of reviews depends on whether the concurrency or quality issues dominate. Their model is extended by Ahmadi and Wang to also consider how resource is allocated to different design stages. In this case, the model is used to consider how the reviews should be scheduled with a view to minimising the risk of missing targets.
Considering related issues, Yassine et al. model information exchange between development teams, using it to show that problems can arise because teams that work concurrently on interdependent problems only coordinate periodically and thus often make design decisions based on outdated information. Braha and Bar-Yam focus on structural characteristics of the information flow network among tasks being worked concurrently.
They develop a model in which, whenever a task is solved, there is a chance that interdependent tasks will require iteration. Analysing task networks from several domains, they find common structural characteristics: most tasks are not strongly connected, but those that are strongly connected are shown to be especially susceptible to such iterations. Several further models of these iteration dynamics have been proposed; one is due to Loch et al.
Two others are due to Huberman and Wilkinson and to Schlick et al. [Reproduced from Prasad b.]
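The susceptibility of strongly connected tasks can be illustrated with a toy cascade simulation. The network, probability, and counts below are invented: whenever a task is (re)solved, each task that depends on it must itself be re-solved with some probability, so tasks inside a dependency cycle accumulate far more re-solutions.

```python
import random

# Toy rework-cascade sketch (structure and numbers invented): 'hub' sits
# in a cycle with a and b, while c and d only consume hub's output.
random.seed(0)

dependents = {               # task -> tasks that consume its output
    "hub": ["a", "b", "c", "d"],
    "a": ["hub"], "b": ["hub"], "c": [], "d": [],
}
p = 0.3                      # chance a change propagates along a dependency
solve_counts = {t: 0 for t in dependents}

for _ in range(200):                     # repeat the experiment
    queue = list(dependents)             # initially every task needs solving
    budget = 200                         # cap cascade length
    while queue and budget:
        budget -= 1
        t = queue.pop(0)
        solve_counts[t] += 1
        for d in dependents[t]:
            if random.random() < p and d not in queue:
                queue.append(d)          # d must be re-solved

print(solve_counts)   # the coupled 'hub' accumulates the most re-solutions
```

Because 'hub' is fed back by a and b while c and d are not fed back at all, its re-solution count exceeds theirs, echoing the finding that strongly connected tasks are the most iteration-prone.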