Abstract

In the architecture, engineering, and construction (AEC) fields, computing and information technologies play an increasingly prevalent and complex role in day-to-day work. Consequently, educators must adjust and, in many cases, reimagine curricula and teaching methodologies to adapt to the changing landscape. Past research efforts led by the ASCE Computing Division Education Committee, formerly called the Task Committee on Computing Education of the Technical Council on Computing and Information Technology (TCCIT), have regularly surveyed AEC educators to understand computing trends in AEC curricula, with the latest survey taking place nearly a decade ago. This work presents the results of an updated survey that used this prior work as a springboard, providing timely insights into the computing skills and curricular barriers faced by AEC educators today. The results showed that the technical skills used by students have evolved, but the barriers to incorporating new skills into curricula have remained largely the same. In addition to comparisons with prior surveys, this work presents the results of an expanded, open-ended portion of the survey that explores educator perspectives on the future of the AEC workforce through a broader lens than previous surveys. Thematic analysis of these open-ended responses revealed common themes that organized the findings. For example, educators provided their vision of the competencies the future AEC workforce will need, which were thematically organized into a continuum based on the level of interaction between humans and technology. These results suggest an increasingly complex and evolving relationship between the AEC workforce and emerging technology, highlighting the need for educators to encourage the development of technological adaptability and agility. Overall, this work provides a systematic comparison of current educational practices in AEC computing with those of a decade ago to illustrate educational shifts, and adds predictions of AEC trends from experts in AEC education, providing crucial discussion of curricular transformations that will better position students for success in the workforce.

Introduction

Timely and effective use of computing and information technologies provides a significant competitive advantage in today’s architecture, engineering, and construction (AEC) community practices. At the end of the first quarter of the 21st century, computing and information technologies have reached inflection points (Chong et al. 2017), such as more cost-effective computing, the transience of artificial intelligence (AI), and the redesign of benefits of sociotechnical systems. Despite their prevalence and importance, incorporating these technology paradigms into AEC foundations and practices and into the curricula of AEC programs remains a challenge. It is imperative for AEC educators to be engaged in how to make these advances impactful for the next generations of students. Motivated by these challenges, efforts of the ASCE Computing Division Education Committee, formerly called the Task Committee on Computing Education of the Technical Council on Computing and Information Technology (TCCIT), focused on developing new frameworks for understanding the technology inflection points in civil engineering education. The core of these efforts is to provide insights into the impact of computing technologies within AEC education and raise questions about how to facilitate the implementation of these findings.
With these points of departure, the committee explored opportunities and challenges of integrating computing literacies into AEC education. The resulting outcomes present the current landscape of computing in AEC education, revealing the trends of computing and technology currently operating in the AEC discipline and its programs, and a vision for what this could look like in the future. The efforts presented here are intended to guide how educators within this community think about preparing their students for the needs of their future careers.
The fields of AEC are facing a variety of major challenges that will require future professionals in these domains to be prepared to work smarter, more efficiently, and more effectively than ever before. For example, the AEC fields are experiencing labor shortages in many locations that make securing human resources challenging or impossible, which may necessitate new methods for construction that rely less on manual labor (OECD 2023). Natural disasters are occurring with higher frequencies and intensities, which present urgent needs for rapid reconstruction after these events, or prevention before they occur. Additionally, the negative environmental impacts of traditional methods for delivering built infrastructure have spurred innovative building methods that can provide more sustainable performance. Finally, many regions of the world, including the US, are struggling to maintain aging, and at times crumbling, built infrastructure.
The recognition of these and other emerging challenges has spurred the AEC fields to begin adopting more innovative technologies to enable a better built infrastructure. Over the last decade, the industry has seen an uptick in technology adoption (Kalanithi 2022). Nevertheless, the industry remains less technologically advanced than many other sectors and still lags behind them in critical metrics like productivity and safety. For example, the construction industry has averaged significantly slower growth in productivity in the last two decades compared with industries like retail or manufacturing (McKinsey Global Institute 2017). These trends highlight the need for academic institutions with AEC programs to consider how the future needs of the industry will be addressed by shifting how current AEC students (i.e., the future industry leaders) are prepared with necessary computing and technology competencies to advance the evolution of this industry.
Within the context of this paper, two concepts are frequently used to guide the discussion of the future of computing in AEC education. The first is computational skills, referring to the ability to use computer-based technologies for tasks (in this case, AEC-related tasks), which include data science practices, or technology applications that combine statistics, scientific methods, and data analysis to extract value from data (structured and unstructured) (Dhar 2013). The second concept is computational thinking, which is a set of problem-solving methods that format problems in ways that computer systems can solve and involves computational concepts, computational practices, and computational perspectives (Brennan and Resnick 2012; National Research Council 2010). Computational thinking answers questions like “what do we need to understand about computers?” and “what must we do to get a computer to work for us?” (Denning and Tedre 2019). In short, computational skills are the abilities to use specific technologies, whereas computational thinking is the mindset that enables the development and utilization of a variety of skills.
This study presents survey responses from AEC educators from across the globe about their perceptions of current and future AEC-related computing competency needs, highlighting the opportunities for educational innovation within these domains. It explores faculty perceptions about the future of AEC fields through both closed- and open-ended questions. Through the use of quantitative and qualitative analyses, resultant trends are presented to indicate how educators envision the future of AEC and how they believe students should be prepared accordingly. Although these trends may indicate patterns in reports from faculty, the authors do not make claims about the best methods for realizing the suggested advances in education, because the specific teaching needs and implementation strategies can vary widely across different regions, institutions, and domains within AEC. Accordingly, the goals of this research are as follows:
Investigate the current status of computational concepts and skills in the curricula of AEC programs.
Explore faculty perspectives on the future computational needs of students entering the AEC workforce.
Explore faculty perspectives on the changes needed in the coverage of computing in AEC higher education curricula, and barriers to those changes.

Background

Previous research conducted by the TCCIT in 1986, 1989, 1995, and 2002 focused on assessing the computing components of civil engineering curricula (Fenves and Rasdorf 2001), and more recent surveys in 2012 and 2014 assessed the state of computing in AEC education (Gerber et al. 2015; Khashe et al. 2016). The 2012 survey aimed to assess the evolution of computing in civil engineering, determine whether computer science knowledge or skills should be prioritized in AEC education, and provide benchmarks for future evaluations of the state of computing in AEC education from educators’ perspectives (Gerber et al. 2015). The 2014 survey compared its results with those of the 2012 TCCIT survey and evaluated the state of computing within AEC curricula with respect to changes implemented within those 2 years (Khashe et al. 2016). Those two studies highlighted that computer skills are more important than computer science knowledge in AEC curricula and identified barriers to further incorporating computing into AEC curricula, such as lack of space in the curricula, insufficient resources, insufficient funding, and inadequate criteria for making curricular decisions (Gerber et al. 2015; Khashe et al. 2016).
Other ASCE TCCIT committees have presented works on grand challenges facing the civil engineering fields. The Data Sensing and Analysis (DSA) committee explored challenges within civil engineering that can be addressed with DSA research, finding a variety of issues ripe for improvement, ranging from high building energy consumption to groundwater depletion to infrastructure resilience (Becerik-Gerber et al. 2014). The Visualization, Information Modeling, and Simulation (VIMS) committee found grand challenges in the construction industry that included as-built modeling problems and a disconnect between research and practice as top unsolved issues (Leite et al. 2016). These surveys and papers provide a clear and necessary picture of the ways in which the AEC industry is evolving and how computational skills and applications are becoming increasingly important in addressing current and developing challenges.
In addition to the ASCE TCCIT surveys, prior studies have explored various computing trends that have led to technological and educational changes in AEC. Several studies in the last few decades have explored computing trends such as building information modeling (BIM), which has risen to prominence as a key digital technology in the AEC industry as well as in AEC education (Cooksey 2011; Fruchter et al. 2018; Wang et al. 2020). Similarly, the integration of automation and robotics in AEC curricula has been a topic of discussion for several decades (Boles and Wang 1996). In a 2015 survey of US construction industry professionals, BIM, augmented and virtual reality (AR/VR), wearable technology, laser scanning, and drones were frequently mentioned as areas of interest in future technology (Holt et al. 2015). More recently, a 2022 survey of accredited civil and environmental engineering departments in the US found limited incorporation of modern data science languages into current curricula and identified barriers such as educators’ level of knowledge and lack of space in the curricula to incorporate these skills (Grajdura and Niemeier 2022). Similarly, a review of AEC literature conducted in 2022 identified AR/VR, laser scanning, drones, AI, robotics, three-dimensional (3D) printing, digital twins, internet of things (IoT), blockchain, and off-site construction as emerging technologies in the AEC industry, and emphasized the need for further research related to teaching new technologies so that educators could share their experiences in incorporating them into AEC curricula (Debs et al. 2022).
In engineering education–specific literature, researchers have explored which skills civil engineering students think they may need for success, with students ranking communication skills and teamwork among the top skills (Polmear et al. 2020). Another study explored accreditation criteria from the perspective of participation and control dynamics between teacher and student, noting the more active role that students play in learning, which requires adaptation in teaching methods (Forcael et al. 2022). In these studies, computing was mentioned in a general sense, but the studies did not take a deep dive into the computational aspect of educational preparation. Other work has considered specific methods to prepare students for the changing digital landscape, like project-based learning boot camps, flipped classrooms, or gamification (Fruchter 2018; Ilbeigi et al. 2023; Safapour et al. 2019; Torbaghan et al. 2023). These studies suggested particular educational interventions and, although helpful, did not dive into the nuances of the developing needs specific to AEC education.
Despite substantial prior research efforts in related areas, there is a noticeable knowledge gap in the field because there has not been a comprehensive study focusing on the state of computing in AEC education in the last decade from educators’ perspectives. This gap in research has limited our understanding of how the AEC industry has adapted to technological advancements and the extent to which computing tools have been incorporated into AEC curricula. Additionally, the current needs of the AEC industry may not be accurately reflected in the older literature, making it imperative to conduct a study to understand the needs of the AEC industry in the near future. Accordingly, this study, by building on the previous TCCIT surveys, explores the current and future computing skill needs of the AEC industry using both closed- and open-ended (nonprescriptive) survey questions.

Research Questions

To address these knowledge gaps, this work focuses on the following research questions:
RQ1: What computing competencies are currently targeted by AEC educators, and how do these competencies compare with previously reported findings?
RQ2: What computing-related competencies do AEC educators envision their students needing in the next 5–10 years that are not currently required?
RQ3: Based on the competencies covered and the future envisioned, what changes will need to be made to AEC curricula to adequately prepare students to meet industry needs?

Methodology

To identify patterns in perceptions about computing competencies reported by AEC educators across the globe, a survey methodology was adopted for this work, as outlined in Fig. 1. The authors developed a survey based on prior ASCE surveys with similar aims (Gerber et al. 2015; Khashe et al. 2016), and expanded beyond only convergent questions to also include open-ended questions aimed at eliciting input from educators about emergent needs in these domains. After the survey was validated through multiple iterations among the authors, it was distributed to various listservs targeting AEC educators. Responses were analyzed using both qualitative and quantitative methods. The steps of the research methodology are presented in detail in the subsequent subsections.
Fig. 1. Flowchart of study methodology steps including design, distribution, and analysis of survey results.

Survey Design

To analyze educator perspectives on computing skills students need for success in the future, this study adopted a mixed methods approach, using self-reports of both quantitative and qualitative factors. Initially, survey questions were designed to elicit some background information, such as the respondents’ institution, country, job title or role, level of students taught (undergraduate, graduate, or other), and their program or academic area (architecture, engineering, construction, or a combination of these). This information helped the authors sort responses and identify response trends based on relevant factors. Additional personally identifiable demographic information was not collected per the approved institutional review board (IRB) protocol corresponding to this survey and methodology. The anonymous nature of the survey was intended to encourage faculty participants to be open and honest in their responses, even if those responses may reflect negatively on their current institutions. The Supplemental Materials contain the complete list of survey questions.
Following the background questions, the survey continued with several closed-ended questions. These questions were largely derived from the Gerber et al. (2015) survey. This approach enabled a comparison of some of the results of this study with the trends reported by Gerber et al. (2015) a decade ago. In addition, some components of the parallel questions were modified to align with current technological advancements. More specifically, these closed-ended questions targeted responses about (1) topic areas of advanced computing literacies relevant to students; (2) programming languages used or taught; (3) applications of computing technologies covered in curricula; and (4) barriers to incorporating computing into curricula. These topics allowed for a quantitative analysis of current educator responses and a direct comparison with previously reported research from a decade ago.
In addition to the closed-ended questions related to specific curricular content, the survey also included several open-ended questions. These questions elicited responses from participants about their visions for the future of computing in AEC fields. They were asked to describe what students will eventually need to do in their future careers that may be different from current needs. Then they were asked to explain what they believed would be required for students to learn to effectively prepare them for these future careers. Finally, participants were asked to define what educators may need to do or change to support students’ learning.
Prior to distributing the survey, the questions were iteratively reviewed with various faculty members. This validation process was performed to ensure that the questions incorporated in the survey elicited the kinds of responses intended and did not inadvertently confuse respondents. Through this process, several small modifications to questions’ texts were made, but no major structural changes were required to achieve a consistent understanding of the survey questions. After iteratively validating the survey and resolving concerns related to specific question text, the survey was ready for distribution to AEC educators.

Survey Distribution

The survey was distributed through global listservs and contact methods tied to existing professional organizations in AEC disciplines, including ASCE Education Committee, ASCE Computing Division Committees (formerly TCCIT), the Construction Research Congress (CRC), Associated Schools of Construction (ASC), Accreditation Board for Engineering and Technology (ABET), National Architecture Accrediting Board (NAAB), American Council for Construction Education (ACCE), Intelligent Computing in Engineering (EG-ICE), Co-operative Network of Building Researchers (CNBR), the International Association for Automation and Robotics in Construction (IAARC), International Conference on Computing in Civil and Building Engineering (ICCCBE), International Council for Research and Innovation in Building and Construction (CIB W78), and the Asian Group of Civil Engineering Informatics (AGCEI).
A global list of targets was selected to incorporate feedback from individuals in various regions, because conditions and operations vary by country and by the needs of different nations. Targeted participants were faculty (department heads, program directors, faculty, and instructors) associated with higher education institutions, both US-based and international, who taught content related to architecture, architectural engineering, civil engineering, construction management, or construction engineering and management. Because different disciplines have different accreditation requirements, expectations, and curricular needs, faculty from each of the AEC disciplines were included to provide perspectives from each discipline. The survey actively collected responses from August 30 to November 30, 2022.

Methods of Analysis

After collecting all survey responses, the data were organized and analyzed. Any surveys that indicated a No response to the IRB consent form were removed from consideration. Incomplete surveys were screened to determine if respondents had provided at least one identifying characteristic (institution, country, role, level, or academic area) and one response related to the future of AEC education content. If so, these surveys were included in the data set. If not, they were excluded. After organizing and filtering the data for completeness, the results were analyzed.
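For illustration, the sketch below restates this screening logic in code, assuming the responses are exported to a tabular file; the column names (irb_consent, institution, vision, and so on) are hypothetical placeholders rather than part of the published survey instrument.

```python
# Minimal sketch of the response-screening logic described above; column names
# (irb_consent, institution, vision, ...) are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Remove any survey that indicated a "No" response to the IRB consent form.
responses = responses[responses["irb_consent"] == "Yes"]

identifying_cols = ["institution", "country", "role", "level", "academic_area"]
future_cols = ["vision", "curriculum", "support"]

# Keep a record only if it provides at least one identifying characteristic
# and at least one response about the future of AEC education content.
has_identifier = responses[identifying_cols].notna().any(axis=1)
has_future_response = responses[future_cols].notna().any(axis=1)
filtered = responses[has_identifier & has_future_response]

print(f"{len(filtered)} responses retained for analysis")
```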

Descriptive Statistics

The closed-ended questions were analyzed using descriptive statistics, enabling comparison with previous studies on this topic and the identification of clear quantitative patterns across respondents. This method helped to illustrate trends regarding the most cited skills that students may need related to computing in AEC domains. This approach also allowed the authors to identify any trends that differed between the AEC domains. Finally, the use of descriptive statistics for the closed-ended questions allowed the findings of this work to be compared with the results of previously published work from a decade ago that targeted similar insights from AEC faculty (Gerber et al. 2015; Khashe et al. 2016).
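As an illustration of the kind of descriptive statistics used here, the sketch below computes the percentage of respondents selecting each option of a check-all-that-apply question. The option labels follow Fig. 2, but the column name and the semicolon-separated encoding of selections are assumptions for the example only.

```python
# Sketch of descriptive statistics for a check-all-that-apply question; the
# column name and the semicolon-separated encoding of selections are assumptions.
import pandas as pd

literacies = ["Data analytics", "Programming",
              "Human-computer interaction", "AI/Machine learning"]

def selection_rates(series: pd.Series, options: list) -> pd.Series:
    """Percentage of respondents who selected each option."""
    selected = series.dropna().str.split(";")
    n = len(selected)
    counts = {opt: sum(opt in sel for sel in selected) for opt in options}
    return (pd.Series(counts) / n * 100).round(1)

# Example usage (column name is hypothetical):
# rates = selection_rates(filtered["advanced_literacies"], literacies)
```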

Thematic Analysis

For open-ended questions, a thematic analysis approach was used to identify and document emergent themes. Thematic analysis allows the identification of common themes and patterns across respondents, even if the exact verbiage varies. Because the backgrounds and disciplines of respondents vary widely, this analysis enables meaningful findings to emerge from open-ended data. This type of analysis, although less common in AEC disciplines than quantitative data analysis, has been used in other literature analyzing open-ended and opinion-based data within AEC disciplines (Hartless et al. 2020). The topics analyzed using this approach were (1) the vision of respondents for what students will do in the future; (2) the changes needed in curricula to teach these future skills; and (3) the support needed to enable those changes.
The Braun and Clarke framework for thematic analysis was followed (Braun and Clarke 2006), which comprises six steps: (1) familiarizing yourself with your data, (2) generating initial codes, (3) searching for themes, (4) reviewing themes, (5) defining and naming themes, and (6) producing the report. In this type of analysis, codes are generated directly from participant responses, and these codes are organized into thematic categories, reviewed, and then defined.
In this study, both semantic themes (based solely on the surface meaning of participant responses) and latent themes [exploring underlying ideas that are “theorized as shaping or informing the semantic content of the data” (Braun and Clarke 2006)] were identified for analysis. This analysis approach enabled the findings to offer insights into topics that may have fallen outside of the scope of the closed-ended questions and may also help to provide detail on how faculty envision the skills identified being applied by students in their future careers.
Thematic analysis is a qualitative approach to understanding the data and the underlying ideas that emerge from it. The results of thematic analysis include codes and themes as described previously, and the results take a more discussion-based format than the quantitative methods typical of much AEC research. Due to the exploratory nature of the thematic analysis and the emergent nature of the themes, this work avoids assigning significance to the emergent themes based on their frequency, presenting instead a framework of ideas regarding the current and future state of AEC education. This approach avoids falsely assigning importance to an idea solely on the basis of prevalence, especially in this survey, which asked participants to make predictions about the future needs of the AEC workforce and the connection with higher education. Thus, the thematic analysis of the open-ended questions within this survey results in a list of codes and themes that emerged from each analysis, with each of the three questions being subjected to its own unique analysis.

Results and Discussion

Descriptive Statistics

The collected, filtered responses resulted in a sample size of 84. The results included responses from 67 unique institutions, with no more than three respondents from a single institution. Specific institutions listed were used by the researchers to define aggregate trends in responses, but were not listed here to protect the anonymity of respondents, especially those at smaller institutions that could be identified through the listing of these names. Respondents also indicated geographic location, academic role, level of students taught, and affiliated discipline as follows:
Geographical location: North America (54% of respondents), Europe (21%), Asia (including Turkey) (13%), Africa (6%), South America (5%), and Oceania (1%).
Role: Professor (29%), assistant professor (23%), associate professor (15%), lecturer (13%), administrator (1%), and other (19%). The “other” responses included two senior lecturers and a list of single instances of a variety of other titles (e.g., invited senior professor, professor of practice, professor emeritus, and research associate).
Level of students taught: 93% of respondents taught undergraduate students, 80% taught graduate students, and 12% indicated that they taught “other” students such as professional certification students, postdoctoral, and Ph.D. students (some schools refer to only master’s students as graduate students and Ph.D. students with other terminology, whereas others include Ph.D. students in the graduate student category).
Affiliated discipline: 18% of respondents were associated with architecture, 64% with engineering, and 81% with construction. Several respondents taught students within more than one AEC discipline, which explains the percentage breakdown of responses totaling more than 100%.
The results of this survey showed interesting patterns with respect to past comparisons, current trends, and future vision. The closed-ended questions provide a look back in time to see what has changed and what persists with regard to technologies, topic areas, and barriers, utilizing the 2012 and 2014 surveys as benchmarks for comparison. The open-ended questions turn away from the past and look forward to what the future might entail regarding the needs for the future AEC workforce and how AEC education fits into those needs and challenges.

Answers to Closed-Ended Questions

The results from this survey were compared, where parallel questions existed, with the two previous studies, whose data from 2012 (Gerber et al. 2015) and 2014 (Khashe et al. 2016) were used to inform some of the questions in this survey. In particular, direct comparisons can be drawn between the results of the questions regarding programming languages and the barriers to implementing new advancements in the curriculum.
The survey results presented by Gerber et al. (2015) focused heavily on computing skills and broke down in detail the specific skills that were relevant at the time. Now, only a decade later, there are notable differences in the skillsets required, with new and emerging technologies changing the landscape of learning.

Advanced Computing Literacies

Participants indicated which advanced computing literacies they considered relevant to their students’ future success in the industry. The majority of respondents indicated that data analytics, programming, human–computer interaction, and artificial intelligence/machine learning were technical competencies that students would need for workplace success (Fig. 2). Those who responded “other” described a variety of skills, including computer-aided decision-making, natural language processing, and digital twins.
Fig. 2. Percentage of respondents who indicated each advanced computing literacy would be relevant to their students’ future success.
There is no perfect parallel to this question in the 2012 and 2014 surveys, but somewhat similar questions were posed in each. The 2012 survey explored the importance of various computing abilities in AEC education, and the results cited taking advantage of commercial tools as most important in architecture and construction management programs, and programming and algorithms as most important in engineering programs. Meanwhile, machine learning, distributed computing, and network science were rated as the least important in architectural and construction management programs. The 2014 survey anecdotally described respondents’ answers to how to better prepare students for future jobs, with the top recommendation being to improve computer skills and computer science knowledge.

Programming Languages

Programming emerged as a prominent technical competency, and an additional question probed deeper into the specific programming languages taught or used in AEC curricula. When asked which programming languages would be relevant to AEC students entering the workforce, respondents indicated which languages were taught in the curriculum (Teach) and which languages students used in their academic work (Use). Almost half the respondents indicated that students used Python in their academic work, although only about a third of respondents said it was formally taught as part of the curriculum. The second most common language was MATLAB, which was taught and used at nearly the same rates. The C programming languages were also common, with over a quarter of respondents indicating that students used at least one of these languages in academic work. Several other languages were also indicated, with each being used by students more than taught in the curriculum.
The 2012 and 2014 surveys asked a similar question and reported a rank order of programming languages as shown in Fig. 3. Findings include the following trends:
Python jumped to the top spot in Use, but was second in Teach, indicating that there is high student demand for using this language, but it is not as frequently taught as it is utilized by students.
MATLAB retained prominence as a commonly used language, jumping to second most prominent in this survey versus third most prominent in the study by Gerber et al. (2015). For MATLAB, the Use and Teach results were similar, indicating that demand by students seems to be met within the curriculum.
C languages retained prominence, and Visual Basic remained relatively unchanged as well.
Other languages saw a decline in prominence. Respondents indicated that Java was used less, and that FORTRAN and HTML were no longer included and have lost prominence in these fields.
Some additional languages appeared in the ‘other’ category, including R, SPSS [which was also included in the Gerber et al. (2015) survey], mathCAD, Power BI, Prolog, Delphi, and Excel.
Fig. 3. Ordered frequency of programming languages from the 2012, 2014, and 2022 surveys with a bar chart of percentage of respondents who indicated that their students were taught (teach) or currently used (use) each programming language on the right. The duplicate HTML ranking was original to the published article.
Overall, the specific languages being used and taught have shifted, but the need for coding competencies continues to be recognized. On all results in this question, Use was higher than Teach, indicating that students need to develop skills in these languages in order to apply them even if they did not receive formal training in the curricula.
This shifting of specific programming languages is not surprising. New languages emerge as favorites and usage tends to follow, indicated by the obsolescence of FORTRAN and the current emergence of Python as a language of choice. The languages needed depend on and evolve with the data that are available and being used for decision-making. Less important than the actual language is the ability to apply a programming workflow regardless of language, such as laying out the data flow, breaking the problem into solvable blocks, applying programming patterns, and debugging. Choosing an appropriate tool or language for the task is an ability that will remain relevant as specific languages emerge and evolve. These ideas are pieces of the computational mindset that have been a consistent thread throughout these results.
Part of the computational mindset is making the distinction between data processing/movement, data exploration, and data representation. For example, some of the items that emerged in the results are actually data interaction/exploration environments (R and SPSS) or data presentation environments (Power BI). Fundamentally, people use “languages” to communicate intent in a variety of contexts and adapt language as needed to effectively communicate. In the context of programming languages in AEC, the important thing is to start with understanding what needs to happen with the existing data, then determine the language that best enables it.
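As a small illustration of this language-agnostic workflow, the sketch below lays out a data flow for a hypothetical AEC task (summarizing building temperature readings), breaks it into solvable blocks, and separates data movement, cleaning, and representation; the task, file format, and function names are invented for this example and are not drawn from the survey.

```python
# Sketch of a language-agnostic programming workflow for a hypothetical AEC task
# (summarizing building temperature readings); data and names are invented.

def load_readings(path: str) -> list:
    """Data movement: read one numeric reading per line of a text file."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def clean_readings(readings: list, low: float, high: float) -> list:
    """Data exploration/validation: discard physically implausible values."""
    return [r for r in readings if low <= r <= high]

def summarize(readings: list) -> dict:
    """Data representation: reduce the cleaned data to a small summary."""
    return {"count": len(readings),
            "mean": sum(readings) / len(readings),
            "max": max(readings)}

if __name__ == "__main__":
    raw = load_readings("temperatures.txt")              # lay out the data flow...
    cleaned = clean_readings(raw, low=-40.0, high=60.0)  # ...in solvable blocks
    print(summarize(cleaned))                            # each block easy to test and debug
```

The point of the decomposition is not the specific language but the pattern: each block has a single purpose, can be tested and debugged independently, and could be reimplemented in MATLAB, a visual programming environment, or any other tool without changing the overall data flow.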
Another interesting evolution is the rise of visual programming environments like Dynamo and Grasshopper, which theoretically make programming easier to understand and foster exploration of a solution space through multidimensional parametric modeling and optimization. These environments are very explicit about mapping out the flow of data from specific input to desired results. The focus then becomes the flow of data, rather than learning specific language skills. However, even though these environments make it easier to engineer the flows, the fundamental need to plan, subdivide, implement patterns, and debug is still there, which again makes up part of the computational mindset and requires, at least at this current point in time, the judgment and guidance of a knowledgeable human user.

Applications

In addition to computing literacies and technical competencies with programming languages, participants also indicated specific applications of computing technologies that are currently covered in their program curriculum (Fig. 4). Most respondents indicated that BIM (83%) and computer-aided drawing (CAD) (77%) were taught within their curriculum. Analysis/simulation/engineering calculations (57%) also emerged as a prominent application of computing technologies within existing AEC curricula, and data management/decision support (51%) and visualization/AR/VR/mixed reality (MR)/extended reality (XR) (52%) were each taught in a majority of the respondents’ AEC curricula. Sensing/3D scanning/unmanned aerial vehicles (UAVs) (drones) (43%), parametric design (32%), and algorithms/automations (32%) appeared in a minority of responses.
Fig. 4. Percentage of respondents who indicated that each application of computing technologies was covered in their program’s curriculum.
The 2012 survey also asked about computing technologies and reported applications in the following rank order: BIM (86%), visualization (76%), parametric design (74%), computer-aided design (71%), simulation (including mechanics) (63%), algorithms (57%), analysis (56%), human–computer interaction (52%), automation: scripting repetitive tasks (49%), and sensor networks (48%). Although the questions asked were similar, the exact components of each question differed slightly based on the relevance of technologies at each time period. For example, the 2012 survey included human–computer interaction in this group of topics, whereas this survey included human–computer interaction in a separate question regarding advanced computing literacies (Fig. 2). More information regarding the specifics of each question has been given by Gerber et al. (2015), and the Supplemental Materials provide the questions specific to this survey.
The 2014 survey included a Likert-style question asking respondents about the importance of various computing skills and their coverage in the curricula. Although many of the categories were different from this survey and as such are not presented for comparison, it should be noted that CAD ranked first and BIM ranked second in the competencies that were covered, and that order was reversed (BIM first and CAD second) when considering importance in the 2014 survey. A full list of the results can be found in the respective survey papers (Gerber et al. 2015; Khashe et al. 2016).
Although BIM and CAD retained importance, the current survey showed movement beyond a focus on static presentation in the form of word processing or presentations to a deeper use of technology to more fundamentally change the approach to teaching through concepts like simulation, data analytics, and VR. The S-curve of technology adoption describes the process of how most technology sees slow adoption, a rapid growth upon reaching a critical mass of adoption, then a flattening of the curve again as the technology becomes a norm. It is likely that some of these technologies (word processing and presentations, for example) have reached the status of ubiquity, becoming innate skills that are used from childhood, and were thus not even included in this iteration of the survey.
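For reference, the adoption S-curve alluded to here is commonly modeled as a logistic function; the formulation below is a standard textbook expression included for illustration rather than a result of this survey.

```latex
% Cumulative adoption A(t) modeled as a logistic (S-shaped) curve, where
% L is the saturation level (near-ubiquity), k is the growth rate, and
% t_0 is the inflection point at which adoption grows fastest.
A(t) = \frac{L}{1 + e^{-k\,(t - t_0)}}
```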
Although not yet as ubiquitous as word processing, a similar process is happening with BIM, where it is evolving from being a specialized modeling skillset to being a core communication tool and expression of design proposals. From an educational standpoint, BIM will likely continue to be taught as an essential toolset, especially in technical schools, but in many universities, the emphasis is shifting to how we can use data-rich models (created with BIM tools) to enhance/transform our processes and tackle new problems that we can now see.
Perhaps the use of simulation tools is seeing a similar evolution. Rather than being a specialized separate task, with our data-rich models, we can easily simulate the predicted performance of a design and use this feedback in an iterative loop to improve our designs. In this way, simulation is moving down the curve to becoming more mainstream, and we are approaching it as a much more ordinary process used naturally in the production and understanding of results. The notion that there are multiple S-curves, and that they are compressing, is useful for understanding the evolution of computing technology within the AEC industry and the response within AEC education.

Barriers

Although some computing literacies, technical competencies, and specific applications have been incorporated in the curricula of many universities, coverage of these competencies is not ubiquitous, and new competencies are constantly emerging as technology evolves. Respondents indicated the barriers that they experience in incorporating computing competencies into AEC curricula (Fig. 5), with the most prominent being no room in the curriculum (55%). Other barriers included inadequate resources (44%), not an accreditation criterion (40%), lack of teaching assistant (TA) support (40%), inadequate funding (39%), inertial resistance (38%), not considered important (35%), lack of needed infrastructure (32%), insufficient student demand (24%), and other (19%).
Fig. 5. Ordered frequency of barriers to incorporating computing into their curricula between the 2012 and the 2022 surveys (no 2014 data available), with percentages for the 2022 survey shown on the right bar graph.
The 2012 survey asked a similar question and reported the following barriers, in rank order, with the percentages of respondents who indicated each was a barrier: no room in curriculum (63%), inadequate resources to make the curriculum change (46%), insufficient student demand (41%), not an accreditation criterion (40%), not considered important (37%), no one to teach it (33%), inadequate funding (31%), lack of teaching assistant support (31%), and other, especially inertial resistance (11%). The 2014 survey included an anecdotal discussion of barriers but did not provide a rank order for comparison. However, no room in the curriculum, inadequate resources, lack of adequate funding, and not being an accreditation criterion were mentioned as the main barriers in the 2014 survey.
One notable trend worth discussing is that lack of student demand used to be the third most prominent barrier (41% of respondents in the 2012 survey) but is now the least prominent (24% of respondents in the 2022 survey) of the given options in the question (besides “other”) (Fig. 5). This suggests that students see the relevance of emerging technologies and have an appetite for learning that AEC curricula are not fully meeting. This is supported by the data in Fig. 3, which show more students using each programming language than being taught it.
Overall, at a high level, there are institutional barriers that block those who seek change (lack of resources and money), systemic barriers (no room in the curriculum, typically driven by accreditation requirements), and personal barriers (inertial resistance, i.e., educators do not see the need or do not want to change). Overcoming these barriers and objections may require reframing the problem. Rather than replacing engineering content with separate computing courses, the AEC educational body may need to rethink how to adapt the teaching of engineering content relative to the changes in how it is approached using new computing technologies.

Answers to Open-Ended Questions

The prior surveys that have guided the discussion of results presented convergent data, where respondents answered multiple-choice questions, often with the “other” field being the only text response available. Although some of these questions presented forward-thinking results, like a table showing the Top 10 Important Topics for Future AEC Education by Khashe et al. (2016), this survey aimed to broaden the scope beyond understanding the state of AEC education in a convergent manner by allowing open-ended discussion through three text-response questions. In addition to the trends demonstrated by the closed-ended questions presented previously, the open-ended questions provided insights in the form of emergent themes in three areas: participants’ vision for the future of the AEC workforce (Vision), changes needed in current curricula (Curriculum), and support needed to enact those changes (Support).

Vision

The survey captured the respondents’ views on the future of computing and technologies integrated within AEC disciplines. Respondents proposed a wide variety of new technologies that they felt would be relevant to the future workforce, with a wide range in the level and type of human interaction needed to leverage these technologies. The responses both explicitly and implicitly addressed the relationship between students (humans) and the computing competencies (technology) that they would engage with in future work, lending themselves to a continuum-based interpretation, with the resulting framework presented in Fig. 6. Initial codes emerged (taken directly from the participant responses) and are presented as circles in Fig. 6.
Fig. 6. Thematic analysis of responses regarding vision for the future of the AEC workforce.
These codes were then thematically grouped and organized based on the amount of human or technological involvement, and these thematic categories are indicated by the rectangles in Fig. 6. The thematic categories were organized on a continuum based on the ratio of involvement from a person (human) to involvement from technology (whether hardware-based, software-based, or a combination). The leftmost categories in Fig. 6 include activities generally led by humans, the middle categories include activities where humans and technology work in tandem, and the rightmost categories include activities where technology takes a more prominent role, replacing what was once a human task. The thematic organization that emerged was based on the basic collaboration between humans (learners) and the technologies that will augment their capabilities in the future (e.g., how technology augments the learners’ skills and abilities, how the interaction of humans and automation is facilitated, and how technology assists human social interaction and communication). The categories reflect the evolution of computing technologies within AEC education.
Responses included in the rightmost categories in Fig. 6 do not indicate that humans would be completely eliminated from a process; this placement indicates that a specific part of the process may soon be led by a technology, which would enable humans to dedicate intellectual effort elsewhere.
Within the Generally Human-Led Activities grouping, the thematic category of professional skills refers to a human attribute or mentality that provides benefit in a work scenario and does not directly relate to or depend on technology. Codes falling into this category include Leadership/Decision-Making, Focus, Business Intelligence, Working with Other Humans, Self-Study, and Problem Solving.
Near-Term Planning for Technology groups actions that humans can take now or in the near future to plan for technology usage that exists in the current work environment, such as choosing the appropriate technology for a given task, with codes including Choosing Technology for a Given Task, and Managing Data.
Long-Term Planning for Technology refers to actions that humans will take to plan for technology that does not exist today, but will in the future work environment. For example, responses indicating a need for students to adapt to new technology as it is invented would fall into this category, and codes included Building in Extreme Environments, Adapting to New Technology, Awareness of New Technology, and Assessing Return-on-Investment (ROI) of Technology. All of these actions included in the Generally Human-Led Activities grouping are believed to be critical to students’ success in leveraging technology in the industry, but are not intrinsically related to the use or operation of technology.
Within the Human-Technology Partnership grouping, Human Operates Tech describes situations where a person’s output or actions enable a technology to serve its intended purpose in performing work, such as flying UAVs, with codes including Communicating through Technology, Programming, UAVs, Sensing/Scanning, and BIM/virtual design and construction (VDC)/Digital Twins. Tech Informs Human refers to situations where a technological output presents information that a human uses or interprets to perform work, such as using the outputs of an AI platform to support decision-making, with codes including Artificial Intelligence/Machine Learning Decision Support, Data Analysis, Augmented Reality/Virtual Reality, and Simulations.
Human and Technology in Tandem includes situations where humans and technological elements co-depend on the output from the other in order to perform work, such as side-by-side human-robot partnerships (cobots), with codes including Cobots, Operating Autonomous Equipment, and Human–Computer Interaction (HCI). The responses in these thematic categories highlight the aspects of future AEC work where faculty suggest a need to enable students to work collaboratively with emerging technologies.
Within the Technology-Led Activities grouping, Automation in Computing describes actions involving computer calculations and processing that are led by technology, such as generative design or AI performing estimating, with codes including Generative Design, Parametric Design, and AI/machine learning (ML) Estimating/Analysis. Automation in Assembly refers to actions involving physical assembly that are led by technology, such as robotic fabrication of construction assemblies, with codes including Robots and 3D Printing. These responses indicate aspects of computing in AEC domains where tasks traditionally completed by humans will be automated in the future. This categorization does not imply that AEC professionals will be removed completely from these tasks, but the role that humans will play within these tasks may be starkly different from the roles they currently or traditionally have played.
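To make the organization of Fig. 6 explicit, the groupings, thematic categories, and codes described above can be restated as a simple nested mapping, ordered from human-led to technology-led activities; this sketch is for reference only and adds no information beyond the text and figure.

```python
# Organization of Vision codes into thematic categories and groupings, ordered
# along the human-technology continuum described in the text (Fig. 6).
VISION_CONTINUUM = {
    "Generally Human-Led Activities": {
        "Professional Skills": ["Leadership/Decision-Making", "Focus",
                                "Business Intelligence", "Working with Other Humans",
                                "Self-Study", "Problem Solving"],
        "Near-Term Planning for Technology": ["Choosing Technology for a Given Task",
                                              "Managing Data"],
        "Long-Term Planning for Technology": ["Building in Extreme Environments",
                                              "Adapting to New Technology",
                                              "Awareness of New Technology",
                                              "Assessing ROI of Technology"],
    },
    "Human-Technology Partnership": {
        "Human Operates Tech": ["Communicating through Technology", "Programming",
                                "UAVs", "Sensing/Scanning", "BIM/VDC/Digital Twins"],
        "Tech Informs Human": ["AI/ML Decision Support", "Data Analysis",
                               "Augmented Reality/Virtual Reality", "Simulations"],
        "Human and Technology in Tandem": ["Cobots", "Operating Autonomous Equipment",
                                           "Human-Computer Interaction (HCI)"],
    },
    "Technology-Led Activities": {
        "Automation in Computing": ["Generative Design", "Parametric Design",
                                    "AI/ML Estimating/Analysis"],
        "Automation in Assembly": ["Robots", "3D Printing"],
    },
}
```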
The results of this question illuminated the increasingly ubiquitous presence of technology and the complicated relationship that this presents for people and technology. A broad theme from this survey is a consideration of the human role in symbiosis with technology. The future will likely see some replacement of human roles, providing more opportunities for human-technology partnership, and likely persistence of roles that humans do best (e.g., leadership and decision-making). Students need to acquire an awareness and adaptability to the new role of humans in the technological landscape.
This continuum-based framework of interaction between humans and technology is often used to conceptually explain the varying levels of human interaction and attention needed when performing actions or following processes in tandem with technology (Bakker and Niemantsverdriet 2016), which parallels, perhaps, Milgram and Kishino’s well-known continuum between the physical human world and a totally virtual world (Milgram and Kishino 1994). Recent work has applied the human-technology continuum framework to the design process, with applications ranging from semiconductor chip design to video game creation (Seidel et al. 2018). Instead of considering a specific technology or process, this work takes a high-level view of this framework, considering a wide range of tasks and technologies, bounded specifically within the context of AEC education and career opportunities.
This framework provides an organization for the responses of faculty today, based on their knowledge of the present and conjectures for the future, and can be adapted as future technologies and needs arise. There will certainly be developments that no respondent today can foresee, and the framework is not limited to the technologies of today or the technologies that people today can imagine, but rather provides a dynamic lens for understanding the place of any technology as it relates to the human workforce. This question elevates the discussion beyond specific skills to articulate the ways that humans and computers can work together in the future (replacement, partnership, and augmentation) and raises questions regarding what the benefits will be and for whom. Regardless of what students will be doing in the future, there is first a need to understand the most beneficial interaction and partnership models as a prerequisite step in creating the appropriate mentality for effective technology adoption and a healthy computational mindset. The specifics of how to develop the needed toolsets and skill sets naturally follow. By consciously considering the relationship of people to technology, AEC educators can lay the foundation for considering the question of how they might successfully teach the needed skills and encourage a computational mindset in students.

Curriculum

In response to the open-ended question regarding future curriculum, faculty frequently reported a need to augment current teaching practices with new learning content relevant to the future needs of the profession. The two semantic thematic categories that emerged from the responses to this open-ended question were content and method, as indicated in Table 1 and described in more detail in the following subsections. Of the respondents, 62 included substantive responses to the curriculum question. The percentage of these responses associated with each code is included in Table 1. Although the prevalence of each code is interesting in providing insights into the collective mindset of survey respondents, it does not equate directly to importance and should not be interpreted as such.
Table 1. Thematic categories for the question “what do students need to learn” (n=62)
Content
[C1] Develop curriculum to teach specific software (e.g., R): 10% of responses
[C2] Develop curriculum to teach general skill sets (e.g., data analytics): 56% of responses
[C3] Guide students to develop technical or computational mindset/mentality: 29% of responses
Method
[M1] Add entirely new courses or adopt from other departments: 10% of responses
[M2] Integrate new concepts into existing courses: 10% of responses

Note: [Ci] = content; and [Mi] = method.

Content
Many responses focused on curricular content (the “what” of the curriculum). The responses varied in how prescriptive they were, with some of these responses mentioning specific technical/computational skills that students needed to acquire, down to the specific software (e.g., PowerBI, O-Notation, or Procore) (Code category C1), whereas others more broadly mentioned skill categories such as programming, data analysis, and utilizing AI (C2). However, some responses expanded beyond prescribing specific content and instead proposed a need for students to develop technical/computational thinking (C3), agnostic of specific skills. For example, one response proposed that students needed to “understand how technology support[s] ideation, collaboration, and coordination in practice settings.”
Finally, some respondents mentioned the need for students to develop professional skills, like “creative thinking and problem-solving” and “effective communication with humans and computers.” This thematic category had a notable crossover with the content of the Vision question and other open-ended-response questions, not surprisingly suggesting that classrooms should begin to expose students to skills and ideas needed in future careers.
Method
Another common theme within the responses was a focus on the proposed method of modifying the curriculum (the “how” of curricular content delivery), with two principal strategies emerging. One strategy suggested adding courses related to the listed competencies (or using courses from other departments) (M1), and a second strategy suggested integrating these technologies throughout existing courses (M2).
Responses to the survey illuminated many of the challenges in existing AEC curricula. There is a need to incorporate new computing trends (technological innovations and new topics) to keep AEC education in step with rapid advances in computing and the evolution of the AEC industry. At institutions with substantial resources for teaching innovations, these barriers may be addressed by investing heavily in curriculum transformation. However, results show frequent concerns about insufficient credit hours for adjusting to new computing needs and insufficient resources for faculty to make changes.
Some respondents suggested incorporating these topics throughout their curricula, but the challenge of a lack of faculty support remains. For institutions with resource constraints, faculty may consider implementing student-led approaches to incorporating novel computing competencies, in which instructors do not try to learn every new software package or technological tool in order to teach it to their students. Instead, instructors may serve as guides or coaches to help students learn to teach themselves these new tools. This approach would require more effort from the students but would allow faculty to demonstrate in the classroom the competencies most critical to the human part of the human–technological interaction.
Because student interest in and demand for these competencies seem to have risen based on the survey responses, there is an opportunity to use this interest to drive the needed changes in AEC curricula. The exact methodologies for doing this are left to future research.

Support

A common thread throughout the responses was that faculty felt they lacked the resources and support necessary to implement the changes they see as needed. Of all respondents, 60 included substantive responses to the Support question. Table 2 presents the frequency of each code within these responses. The two semantic categories that emerged in this area were Resources and Stakeholders (Table 2).
Table 2. Thematic categories for the question “What do faculty/departments/institutions need to support this?” (n = 60)

Resources
Code  Description                        Frequency of response (%)
[R1]  Funding                            45
[R2]  Personnel                          22
[R3]  Shareable content                  8
[R4]  Software licenses                  8
[R5]  Training                           18
[R6]  Flexibility                        15
[R7]  Accreditation requirements         8
[R8]  Lab space/infrastructure           22
[R9]  Recognition                        3

Stakeholders
Code  Description                        Frequency of response (%)
[S1]  Accreditation agencies             10
[S2]  University/department leadership   13
[S3]  Industry                           15
[S4]  Faculty                            13

Note: [Ri] = resources; and [Si] = stakeholders.

Resources
Most responses listed the resources believed to be necessary to support the needed curricular changes. These resources included funding (R1) (grants, sabbaticals, summer salary, incentives, course buyouts, and student scholarships), personnel support (R2) (TAs, postdocs, professors of practice, and new professors), case studies or other shareable course content (R3), software licenses provided by the department or by software manufacturers (R4), training on and exposure to new technologies (R5), flexibility in the curriculum (R6), accreditation requirements for new computational competencies (R7), lab space and equipment/infrastructure (R8), and recognition for faculty efforts (R9).
Stakeholders
Some responses listed the stakeholders and decision-makers whose support they felt would be necessary to initiate or implement the needed changes. These key supporters and implementers included accreditation agencies (S1), department or university leadership/management/administration (S2), the AEC industry (S3), and faculty themselves (S4) (new faculty with experience and professors of practice).
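As an aside on how the percentages in Table 2 can be reproduced from coded data, the short sketch below tallies how often each thematic code appears across a set of responses. The response sets shown are purely hypothetical placeholders, not the actual survey data.

```python
from collections import Counter

# Hypothetical coded responses to the Support question: each respondent's
# answer is represented as the set of thematic codes assigned to it.
# (Illustrative placeholders only -- not the actual survey data.)
coded_responses = [
    {"R1", "R2"},        # e.g., "we need funding and TA support"
    {"R1", "S2"},        # e.g., "grants plus backing from department leadership"
    {"R5"},              # e.g., "training on new tools"
    {"R1", "R8", "S3"},  # e.g., "funding, lab space, and industry partnerships"
]

# Count how many responses mention each code, then report the
# percentage-of-responses figures in the style of Table 2.
counts = Counter(code for response in coded_responses for code in response)
n = len(coded_responses)
for code, count in sorted(counts.items()):
    print(f"{code}: mentioned in {count / n:.0%} of responses")
```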
One of the motivations behind the structure of the current survey was to expand the scope beyond specific skill-based questions to include computational mindsets in an open-ended format, allowing discussion of topics that are not as easily categorized or quantified. For the open-ended portion of the survey (Vision, Curriculum, and Barriers), therefore, no direct comparison with previous work is available, and the discussion of these responses focuses on the unique contributions of this survey.
Similar to the discussion regarding barriers, this survey found that a variety of resources are lacking for implementing real change, but there was no clear consensus on which stakeholder should enact those changes. The scope of this paper is not to prescribe specific institutional transformations or teaching methodologies, but rather to identify persistent issues and provide a framework for understanding what needs to change before exploring how that change should be made.

Limitations and Future Work

As in all thematic analyses, there is an element of human interpretation, which can lead to incongruent understandings between researchers and respondents, especially when latent thematic categories are incorporated. To mitigate this risk, a widely accepted protocol for thematic analysis, detailed in the “Methodology” section, was followed. Because respondents came from a variety of geographical locations and backgrounds, it is also possible that terms used within the survey carried nuanced interpretations. By providing an open-ended component of the survey, the authors gave respondents the freedom to clarify and expand on their responses as needed.
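As one illustration of how coding consistency can be checked in this kind of analysis, the sketch below computes Cohen’s kappa for two coders’ binary judgments of whether a given code applies to each response. This is a generic example of a common reliability statistic, not a description of the specific protocol used in this study, and the labels shown are made-up data.

```python
def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equal-length binary label sequences."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, estimated from each coder's marginal labeling rates.
    p_a1 = sum(coder_a) / n
    p_b1 = sum(coder_b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

# Did each of 10 responses receive a given code? (1 = yes, 0 = no) -- made-up data.
coder_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
coder_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(f"kappa = {cohens_kappa(coder_a, coder_b):.2f}")  # prints kappa = 0.80
```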
There are also research limitations inherent in the methods of the study. The survey yielded a sample size of 84, which, although large enough to produce aggregate findings, cannot be claimed to be fully representative of the global population of AEC educators. Although an effort was made to connect with professional organizations with global reach in architecture, engineering, and construction, there are likely AEC educators who do not receive communications from any of the targeted organizations. These limitations should be considered when generalizing these findings to the broader community.
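To give a rough sense of the sampling uncertainty attached to percentages drawn from samples of this size, the sketch below computes a normal-approximation margin of error for a reported proportion. The 45%-of-60-respondents example is taken from Table 2; the calculation itself is illustrative only and is not reported in the paper.

```python
import math

# Normal-approximation margin of error for a reported proportion, to give a
# rough sense of sampling uncertainty. Illustrative only; the paper does not
# report interval estimates.
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# Example: the Funding code [R1] was mentioned by 45% of the 60 respondents
# to the Support question (Table 2).
print(f"+/- {margin_of_error(0.45, 60):.1%}")  # roughly +/- 12.6 percentage points
```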
The analyses presented here represent the opinions and ideas of current AEC faculty regarding a rapidly changing computational landscape in the AEC sector, which also has quickly evolving needs and challenges. The findings of this work should be considered within the lens of these evolutions.
Opportunities for future work include expanding the discussion from “what” should be taught to “how” it can be taught. For example, future work may explore innovative methods faculty have used to teach emerging technologies, such as the student-led approaches mentioned previously or methods borrowed from more agile fields like computer science, and seek to understand the efficacy of these methods. Additionally, new teaching methods, or new applications of existing ones, should be considered in the context of emerging computational needs within AEC higher education. This work described existing computing literacies in AEC curricula and current faculty’s vision for the future of AEC computational and technological education, including current barriers, but did not seek to judge which pedagogical methodologies would lend themselves best to the instruction of this content. Future work can therefore leverage these ideas and explore the best strategies for implementation.

Conclusion

Overall, when compared with prior related literature, the results of this work show that computing in the AEC fields continues to be universally recognized as essential for long-term student success. Some skills emerged that were not relevant 10 years ago, and the emphasis on others changed, increasing or decreasing with current trends and demand. Despite these changes in competencies, however, the reported barriers remained similar, with the notable exception of increased student demand: students are no longer a barrier to change but perhaps even a driver of it, with student usage of some technologies outpacing curricular coverage.
The themes that emerged from open-ended faculty responses regarding the future of AEC education recognized that the future AEC domain will rely less on humans for manual labor and repetitive or calculation-based tasks and will leverage more robotics and automation for this kind of work. The near future also holds increased opportunities for human–technology partnership on tasks that combine or iterate between repetitive computation and subjective interpretation or oversight. Human workers will likely continue to do what humans do best: lead technological innovation and adapt quickly to new technologies. This change in the technological landscape will require a shift in the way educators prepare students for the workforce.
The thematic analysis also identified the factors most frequently cited as necessary to support the educational shifts that will prepare students for AEC careers of the future. Faculty cited the need for additional resources, including department/institution-led curriculum reform, better supporting infrastructure for computing, and more funding for faculty time and training.
The contribution of this work is to provide a systematic comparison of current educational practices with those of a decade ago, illustrating educational shifts in our domain, and to add a prediction of AEC trends from experts in AEC education. This study also identified opportunities to support meaningful change that aligns AEC curricula with projected industry trends. Over the last century, many technologies have transformed AEC education and industry.
This paper would not be complete without inviting the AEC community to consider the next technology inflection point, one that will affect not only AEC education and industry but also the broader social, technological, and economic questions that generative AI poses. As innovation curves steepen and technological innovations converge, producing exponentially accelerating transformations, shorter time frames should be considered for future hindsight-foresight surveys that reflect on current education programs and reimagine the AEC educational experiences to be developed for future generations of students.
In a holistic interpretation of the results discussed here, higher education institutions need to consider their agility in responding to, and anticipating, the challenges and opportunities created by a rapidly changing computing environment. A large variety of skills and competencies emerged throughout this survey, with no single one standing out as most important. Perhaps more important than any single skill is the ability of faculty, students, and curricula to adapt to an ever-changing technological landscape: rather than focusing on specific computational tool sets or skill sets, educators should equip students with a computational mindset that empowers them to respond with agility to the continuously changing relationship between AEC professionals and technology.

Supplemental Materials

File (supplemental_materials_jccee5.cpeng-5646_mccorda.pdf)

Data Availability Statement

Some data, models, or code generated or used during the study are available from the corresponding author upon request, including cleaned, anonymized survey responses.

Acknowledgments

This work was funded by the American Society of Civil Engineers Computing Division Education Committee.

Information & Authors

Information

Published In

Go to Journal of Computing in Civil Engineering
Journal of Computing in Civil Engineering
Volume 38Issue 3May 2024

History

Received: Aug 8, 2023
Accepted: Dec 4, 2023
Published online: Feb 20, 2024
Published in print: May 1, 2024
Discussion open until: Jul 20, 2024

Authors

Affiliations

Research Scientist, Building Systems Group, Pacific Northwest National Laboratory, 902 Battelle Blvd., Richland, WA 99354 (corresponding author). ORCID: https://orcid.org/0000-0003-1452-3948. Email: [email protected]
Associate Professor, Dept. of Civil, Environmental, and Architectural Engineering, Univ. of Colorado Boulder, Boulder, CO 80309. ORCID: https://orcid.org/0000-0002-2975-6501. Email: [email protected]
Masoud Gheisari, Ph.D., A.M.ASCE https://orcid.org/0000-0001-5568-9923
Associate Professor, Rinker School of Construction Management, Univ. of Florida, P.O. Box 115703, Gainesville, FL 32611-5703. ORCID: https://orcid.org/0000-0001-5568-9923
Yelda Turkan, Ph.D., Aff.M.ASCE https://orcid.org/0000-0002-3224-5462
Associate Professor, School of Civil and Construction Engineering, Oregon State Univ., Corvallis, OR 97331. ORCID: https://orcid.org/0000-0002-3224-5462
Ivan Mutis, Ph.D., A.M.ASCE https://orcid.org/0000-0003-2707-2701
Associate Professor, Dept. of Civil, Architectural, and Environmental Engineering, Illinois Institute of Technology, Chicago, IL 60616. ORCID: https://orcid.org/0000-0003-2707-2701
Glenn Katz, A.M.ASCE
Lecturer, Dept. of Civil and Environmental Engineering, Stanford Univ., Stanford, CA 94305-4020.
Renate Fruchter, Ph.D., M.ASCE
Director, Project Based Learning Lab, Dept. of Civil and Environmental Engineering, Stanford Univ., Stanford, CA 94305-4020.

Metrics & Citations

Metrics

Citations

Download citation

If you have the appropriate software installed, you can download article citation data to the citation manager of your choice. Simply select your manager software from the list below and click Download.

View Options

Media

Figures

Other

Tables

Share

Share

Copy the content Link

Share with email

Email a colleague

Share