Francesco Fornetti
While the pandemic acted as a catalyst for the adoption of new assessment methods, the need for innovation in this area had long been overdue. With powerful simulation tools and easily accessible web-based resources now widely available, an opportunity emerges for a paradigm shift in assessing electronic engineering subjects.
The driving force behind the work presented in this article was the creation of a novel assessment format that fosters students’ intrinsic motivation to develop a “functioning” knowledge applicable to real-world problem solving. An additional aim was to enhance the efficiency and accuracy of grading for instructors, reducing the time burden while ensuring more precise evaluation.
Usually, RF engineering subjects are assessed either through coursework or a traditional, pen-and-paper, closed-book exam. At the University of Bristol, turning the latter into an open-book online assessment proved challenging in 2020/2021 and 2021/2022 and yielded suboptimal results.
While some institutions employed remote proctoring systems to supervise web-based exams, at the University of Bristol such systems would have been costly and impractical to deploy on a large scale. Instructors were therefore made aware that online exams would be open book and unsupervised and were asked to design their exams accordingly.
Several formats were made available for online examinations, but it was quickly determined that the one that would allow the fairest assessment was a web-based test with partial manual marking.
This meant that up to 80% of the questions would be marked automatically, while 20% or more would be marked manually. This ensured that partial credit could be awarded by grading students’ workings, which they could upload at the end of the test. This format significantly increased students’ satisfaction with online exams and helped combat plagiarism.
To further discourage plagiarism and collaboration, several other techniques were also employed, the most effective of which were the following:
It is important to note that there is an additional mechanism that students could employ to gain an “unfair” advantage in answering some of the questions, not available in a traditional exam: simulation software. However, there are effective ways to design questions to make it harder for students to employ a simulator to answer them without carrying out any calculations. For example, one may ask for the value of a resistor within a circuit that will yield a specific current in a branch, instead of simply providing the values of all of the circuit elements and asking for the current in that branch. The simulator then becomes a means of verifying answers instead of a tool that will produce them in the first instance.
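As a minimal illustration of this style of question (with hypothetical values, not taken from the actual exam), consider a 10-V source driving the series combination of a known 1-kΩ resistor and an unknown resistor R, with a required branch current of 2 mA:

```latex
% Hypothetical example: find R so that the series branch carries I = 2 mA
% from a V_s = 10 V source with a known series resistor R_1 = 1 kOhm.
\[
  I = \frac{V_s}{R_1 + R}
  \quad\Rightarrow\quad
  R = \frac{V_s}{I} - R_1 = \frac{10\,\text{V}}{2\,\text{mA}} - 1\,\text{k}\Omega = 4\,\text{k}\Omega .
\]
```

A simulator readily confirms that R = 4 kΩ yields the requested current, but obtaining that value in the first instance still requires the student to rearrange the expression (or to set up a parameter sweep deliberately).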
At the author’s institution, electronic engineering students are introduced to a powerful simulation tool, the Cadence AWR Design Environment (AWR DE), from the first week of their degree and are provided with bespoke video tutorials to learn how to use it to analyze a broad range of circuits, from the simplest to the most complex. This approach raises the question: Is it fair to use the aforementioned ploys in online exams to prevent students from using simulation software to answer questions?
It is the author’s view that it is crucial to suitably ascertain that students in their first and second year of study have developed a sufficient understanding of circuit analysis, and hence questions that cannot be easily solved using simulation are indeed appropriate. Simulation can still be used as an aid, which may help students feel more confident about their answers and perform better, without masking significant knowledge gaps.
However, upon taking over the RF Engineering module in the 2022/2023 academic year, delivered to fourth-year undergraduate and postgraduate (MSc) students, the author asked himself: At this stage, when basic circuit analysis skills have been assessed and where circuit design is hardly ever carried out on paper, does it make sense to design an assessment for which simulation is still just an accessory? Should it not be the centerpiece instead?
The author, therefore, decided to try out a new type of simulation-based assessment and tap into the potential of open-book, online exams to evaluate the students’ ability to put their knowledge to work—to make it function.
It is important to emphasize that this work was not carried out to bypass the limitations imposed by the COVID-19 pandemic. Instead, its primary purpose was to develop a new assessment paradigm aimed at enhancing the authenticity of RF engineering assessments. It is worth noting that all data presented in this article were collected during the 2022/2023 academic year, after all restrictions had been lifted, and full campus accessibility was restored. This work, therefore, has long-term value, and it will be utilized in our future endeavors.
Teaching and assessment methods should not be seen as separate entities; therefore, the author endeavored to employ the principles of constructive alignment [1] to redesign not just the exam, but the entire module.
The “constructive” aspect refers to the idea that students construct meaning through relevant learning activities. That is, meaning is not something imparted or transmitted from teacher to learner but is something learners have to create for themselves. Teaching effectively becomes just a catalyst for learning [1]. What the student does is actually more important in determining what is learned than what the teacher does [2].
The “alignment” aspect refers to what the teacher does, which is to set up a learning environment that supports the learning activities appropriate to achieving the desired learning outcomes. The key is that the components in the teaching system, especially the teaching methods used and the assessment tasks, are aligned with the learning activities. This way, the learner is in a sense “trapped” and finds it difficult to escape without learning what he or she is intending to learn [1].
A flipped-classroom structure greatly supports the achievement of an aligned curriculum since it entails the provision of video lectures and practice problems that students can work through before participating in interactive, problem-solving activities in the classroom [3], [4]. The approach has been employed by the author in his first-year Linear Circuits and Electronics module since 2020 and has proved very popular, since it enables students to work at their own pace and to use AWR DE as a learning aid, thanks to the provision of bespoke video tutorials [5].
Since the author created a freely available multimedia textbook in 2013, Conquer Radio Frequency (CRF), which features more than 12 h of problem-based video tutorials based on simulations [6], this delivery method appeared to be ideally suited to the RF Engineering unit. However, while CRF provided a high baseline, additional material was needed; hence, problem sets with detailed worked solutions, 2D and 3D transmission line animations [7], and additional simulation challenges were also created.
All of the material was provided on Blackboard, the university’s virtual learning environment (VLE); however, most VLEs offer similar features. This made it possible to create a consistent structure for each topic by means of “learning modules,” each comprising five sections:
Each week an e-mail was sent to students specifying which material they were expected to go through before the scheduled sessions on campus. These sessions were designed to engender critical thinking and were aligned with the planned assessment, thus enabling students to practice with typical exam questions. They also enabled the author to gain further insight into the students’ abilities and the time that an average student would need to answer different types of questions. This helped him “pitch” the assessment at the right level and determine an appropriate difficulty level for a 2-h online exam.
While most students enjoyed this delivery structure, as confirmed by the midmodule feedback, some expressed concerns about the novel exam format, which would require the use of simulation. Indeed, upon running a survey in week 3 on whether they would prefer a traditional online exam, including an upload of handwritten workings, or one that would require them to use a simulation tool to answer the majority of the questions, the class was split 50/50. This, however, led the instructor to suspect that some were opting for the traditional exam for fear of the unknown. He, therefore, ran two sessions during which students were asked to solve problems, such as impedance matching using the Smith chart, both on paper and with the simulation tool. These sessions made them realize how much more laborious and error prone producing pen and paper answers could be, and upon polling them thereafter, 80% of the cohort was in favor of simulation-based exams.
One of the main advantages of simulation-based, online exams is the ability to assess a much broader range of topics than is possible with a standard closed-book exam in a relatively short time window (2 h + 30 min for the upload of workings). For example, carrying out a three-element impedance match on a paper Smith chart with dual coordinates could easily take the average student 30 min—a quarter of the entire exam time—and this would severely limit the range of material that could be examined. In addition, expecting students to answer a question of this type on paper is not reflective of how it is done in real life and is somewhat anachronistic. Furthermore, expecting them to take a clear enough photograph of a paper Smith chart with their workings on it and upload it as a pdf is not very realistic.
The format chosen by the author instead was more closely linked to the design process that students would carry out in real life, which is something that they noted and praised in their feedback. Furthermore, it made it possible to evaluate the skills that students had developed in all of the topics that had been taught and check their “functioning knowledge” instead of their declarative knowledge [1]. This broadened the remit of the assessment and made it much more valuable.
The exam itself comprised 10 questions, five on passive and five on active RF circuits, which evaluated the students’ understanding of the following topics (the standard design relations behind several of these questions are summarized after the list).
Passive circuits:
Q1. Effects of a transmission line segment of a given electrical length, inserted between a test port and a generic complex impedance, on the reflection coefficient and impedance seen by the port.
Q2. Converting series RL and RC networks into their shunt equivalents and vice versa.
Q3. (UP) Designing matching networks between resistive source and load impedances employing (a) quarter-wave transformers and (b) L-sections, with either low-pass or high-pass frequency responses, using discrete elements.
Q4. (UP) Transforming a resistive impedance into a complex one using L-sections comprising only distributed elements.
Q5. (UP) Designing a three-element matching network (Pi or T) for a specific Q.
Active circuits:
Q6. Given a physical bipolar junction transistor model, use its IV characteristics to design a bias network for a given dc supply and collector current.
Q7. Given the two-port S-parameters of a transistor at a specific frequency, as a table, create an s2p file within AWR DE to determine its stability and design a resistive network that ensures unconditional stability over a given frequency range.
Q8. (UP) Given the two-port S-parameters of an unconditionally stable transistor at a specific frequency, as a table, create an s2p file within AWR DE, and design the L-section matching networks required at its input and output.
Q9. Identify the correct steps of amplifier design using either operating gain circles or available gain circles (multiple answer question).
Q10. Determine the output power in watts or milliwatts of a 50-Ω matched RF system, given the input power in dBm and the S21 of each block, and provide the peak-to-peak voltage amplitude corresponding to such a power level.
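For reference, and assuming the standard textbook forms of these results (the exact notation used in the exam may differ), the core relations behind several of the questions above are the following:

```latex
% Q1: a lossless line of electrical length \beta\ell rotates the load
% reflection coefficient clockwise on the Smith chart:
\[
  \Gamma_{\mathrm{in}} = \Gamma_L \, e^{-j2\beta\ell}
\]
% Q2: series-to-parallel conversion of an RL or RC network, with Q = |X_s|/R_s:
\[
  R_p = R_s\,(1+Q^2), \qquad X_p = X_s\left(1+\frac{1}{Q^2}\right)
\]
% Q3: quarter-wave transformer between resistive terminations R_S and R_L:
\[
  Z_{0,\lambda/4} = \sqrt{R_S R_L}
\]
% Q7: unconditional stability via Rollett's factor and the determinant of [S]:
\[
  K = \frac{1-|S_{11}|^2-|S_{22}|^2+|\Delta|^2}{2\,|S_{12}S_{21}|} > 1,
  \qquad |\Delta| = |S_{11}S_{22}-S_{12}S_{21}| < 1
\]
% Q10: cascaded link budget and peak-to-peak voltage in a matched 50-Ohm system:
\[
  P_{\mathrm{out}}\,[\mathrm{dBm}] = P_{\mathrm{in}}\,[\mathrm{dBm}] + \sum_k |S_{21,k}|\,[\mathrm{dB}],
  \qquad V_{pp} = \sqrt{8\,R\,P_{\mathrm{out}}}
\]
```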
The questions that required students to upload their workings are identified by “(UP).” These uploads entailed copying schematics and Smith charts from the simulator into a Word file and adding a handful of annotations on the charts, which could be easily done using a mouse, saving the file as a pdf, and uploading it. Students were given clear instructions on how to do this, ample opportunity to practice, and sample pdfs that unequivocally set the expectation for the content and format of the uploads. Furthermore, 30 additional min were given to create the pdf and upload it.
For each question type, several versions were created and grouped into pools, one for each question type, from which Blackboard would randomly draw each question for each student. This made it highly unlikely that two students would be presented with the same set of questions, thus reducing the opportunity to cheat. The use of simulation made the creation of such pools relatively straightforward and also made it possible to create clear solutions against which the instructor could grade the students’ answers.
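The author built the pools and their reference solutions within AWR DE itself; purely as an illustration of the general idea, a short script along the following lines (hypothetical parameter choices, not the actual exam generator) could produce randomized variants of a matching-network question together with the reference answers used for grading:

```python
import math
import random

def l_section_lowpass(r_source, r_load, f_hz):
    """Series-L / shunt-C L-section matching a real source to a larger real load.

    Returns (L_henries, C_farads); assumes r_load > r_source.
    """
    q = math.sqrt(r_load / r_source - 1.0)   # loaded Q of the section
    x_series = q * r_source                  # series reactance in ohms
    b_shunt = q / r_load                     # shunt susceptance in siemens
    w = 2.0 * math.pi * f_hz
    return x_series / w, b_shunt / w

# Build a small pool of question variants, each with its reference answer.
random.seed(1)
pool = []
for _ in range(5):
    r_load = random.choice([100.0, 150.0, 200.0, 300.0])  # randomized load resistance
    f = random.choice([0.9e9, 1.8e9, 2.4e9])              # randomized design frequency
    L, C = l_section_lowpass(50.0, r_load, f)
    pool.append({"R_load_ohm": r_load, "f_GHz": f / 1e9,
                 "L_nH": round(L * 1e9, 2), "C_pF": round(C * 1e12, 2)})

for variant in pool:
    print(variant)
```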
Furthermore, five of the 10 questions in the exam did not possess a singular right or wrong answer. For example, the realization of an L-section matching network (Q3, Q4, Q8) could be achieved through several different topologies, prompting students to reflect on their chosen network type (e.g., low-pass, high-pass) and its feasibility based on component values. Similarly, for three-element networks (Q5), students were given the choice between a Pi or T configuration, each utilizing different component types. Additionally, for Q7, one of four different stabilization networks could be employed in the answer. These aspects allowed for a range of valid solutions. In this context, the use of simulation tools proved highly advantageous for grading, enabling easier evaluation of the diverse solutions compared to traditional paper-based assessments.
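As a minimal worked example of why several answers can be correct (hypothetical values, not taken from the exam), matching a 50-Ω source to a 200-Ω load at 1 GHz with an L-section gives a loaded Q of √(200/50 − 1) ≈ 1.73, and both a low-pass and a high-pass realization satisfy the match:

```latex
% Low-pass realization: series L toward the 50-Ohm side, shunt C across the 200-Ohm side.
\[
  X_{\mathrm{ser}} = Q R_{\mathrm{low}} \approx 86.6\,\Omega \;\Rightarrow\; L \approx 13.8\,\mathrm{nH},
  \qquad
  X_{\mathrm{sh}} = \frac{R_{\mathrm{high}}}{Q} \approx 115.5\,\Omega \;\Rightarrow\; C \approx 1.38\,\mathrm{pF}
\]
% High-pass realization: series C and shunt L with the same reactance magnitudes.
\[
  C_{\mathrm{ser}} \approx 1.84\,\mathrm{pF}, \qquad L_{\mathrm{sh}} \approx 18.4\,\mathrm{nH}
\]
```

Which of the two a student selects, and whether the resulting component values are realizable, is precisely the kind of reflection that the questions were designed to prompt.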
It would be very challenging to assess this range of skills in a 2-h, closed-book, paper exam! Also, while similar and considerably more complex versions of these questions could be set as coursework, this would make it hard to ascertain that the submission is indeed the student’s own work and would significantly increase the marking workload.
Since this assessment format was very novel at the author’s institution, it was subjected to broad scrutiny, and indeed three different mechanisms were employed to evaluate its effectiveness.
Figure 1. RF Engineering exam: grade distribution.
The exam was taken by 59 students, of whom 84% were graduate students reading for an MSc degree; the remaining 16% were fourth-year undergraduates.
The pass mark for this module was set at 50%, and the resulting failure rate was 18%, which is within the usual range for a module taught at this level.
Universities in the United Kingdom use a compressed marking scale, where 50%–59% is a lower second class, 60%–69% an upper second class, and 70% or above a first class. For this exam, the percentage of first-class grades was 53%, which is higher than usual. The author recognizes that, considering the novelty of the exam and the absence of past exam papers apart from the mock exam that students could attempt up to two days before the actual exam, he may have played it safe with the difficulty level. Nonetheless, the average (65%), standard deviation (26.5), and grade distribution (Figure 1) confirm that the assessment was successful in distinguishing between different levels of competence.
Feedback questionnaires are automatically sent to students at the end of the term and before the exam. The response rate for this module was 16%, and the results are summarized in Table 1. They confirm that the new unit structure and assessment were successfully aligned and enabled students to engage in high-level cognitive processes.
An additional survey was sent to students after the exams to gain further insight into their experience of the novel type of examination. The response rate was 54%.
The answers about the exam format, summarized in Table 2, show that students felt that this type of assessment was an effective way of assessing their skills and enabled them to demonstrate their “functioning knowledge” of the material.
The feedback summarized in Table 3 further confirms that students found the simulation tool easy to use and that it made answering questions easier than on paper. It also shows that students felt that the use of simulation made it possible to evaluate skills that are relevant to an industrial environment.
Furthermore, 58% of students strongly agreed and 29% agreed that they would recommend this exam format to other students or instructors.
Following is a selection of answers to the open question: What do you think are the strengths of this exam format compared to traditional exams?
“This exam format tests theoretical knowledge as well as practical skills and gets rid of fussy calculations.”
“Such tests are more closely related to the practical application of knowledge and test students’ use of software in the industry”
“[the exam] tests the appropriate knowledge (I think in-person exams become a memory exercise so I thought it was beneficial to be examined on using a practical tool that can be applied to real-life scenarios)”
“Everything was clear and convenient, making me feel less nervous”
Students were also asked about what could be improved, and while the majority did not provide an answer, the few who did mostly complained about the upload process on Blackboard itself, calling it “clunky.” A small number of students also asked for more time; however, based on the results, it would appear that the time given to complete the exam was adequate.
The new exam format in which the questions were designed to be answered using a simulation tool brought about several advantages for both instructors and students.
It enabled the instructor to assess a much broader range of topics than would have been possible through a closed-book, pen-and-paper exam and to do so in a much more authentic way, one that is considerably more aligned with industrial RF design activities.
More than 86% of students agreed that the new exam format enabled a fair and accurate assessment of their knowledge and skills, and 74% agreed that it made answering questions easier than on paper.
Since this type of assessment was new to all students on the course, it was crucial to ensure that they were given ample opportunity to understand the expectations and prepare adequately for the exam. Indeed, the use of a flipped-classroom approach and activities within it that were appropriately aligned with the exam made 88% of students feel well prepared and less anxious and, in the author’s view, was the key to the success of this new assessment method.
Ensuring that all students had access to the simulation tool, Cadence AWR DE, was also key. While Windows users could install the software on their own machines free of charge through the AWR University program [8], those running macOS were not able to do so. Remote desktop access was therefore set up to enable those with non-Windows operating systems to log into a virtual Windows machine to access the tool. The software was also made available on all PCs in the faculty.
The methodology does not depend on the specific software suite mentioned in this article: other simulation tools with similar capabilities could be employed instead. Providing students with bespoke video tutorials that are highly relevant to the material they are taught and assessed on would, however, be essential. Furthermore, the assessment method could easily be extended to several other subjects in electronic engineering.
This work exemplifies the compelling need to devise new assessment strategies, not solely driven by logistical or access restrictions, but rather fueled by the desire to infuse assessments with greater meaning and authenticity. Technological advancements should be perceived as enabling tools that facilitate design and creation, rather than as adversaries to be strenuously fought against to preserve the bastion of archetypal assessment from being dismantled.
[1] J. Biggs, “Aligning teaching for constructing learning,” Higher Educ. Acad., vol. 1, no. 4, pp. 1–4, Jan. 2003.
[2] T. J. Shuell, “Cognitive conceptions of learning,” Rev. Educ. Res., vol. 56, no. 4, pp. 411–436, Winter 1986, doi: 10.3102/00346543056004411.
[3] J. Bishop and M. A. Verleger, “The flipped classroom: A survey of the research,” in Proc. ASEE Annu. Conf. Expo., Atlanta, GA, USA, 2013, pp. 23.1200.1–23.1200.18, doi: 10.18260/1-2.
[4] K. Missildine, R. Fountain, L. Summers, and K. Gosselin, “Flipping the classroom to improve student performance and satisfaction,” J. Nurs. Educ., vol. 52, no. 10, pp. 597–599, Sep. 2013, doi: 10.3928/01484834-20130919-03.
[5] Explore RF. YouTube channel. Accessed: Aug. 15, 2023. [Online]. Available: https://www.youtube.com/user/RFMicrowave
[6] F. Fornetti. “Conquer radio frequency — Multimedia textbook.” Explore RF. Accessed: Aug. 15, 2023. [Online]. Available: https://explorerf.com/conquer-radio-frequency.html
[7] F. Fornetti. “Transmission line animations.” Explore RF. Accessed: Aug. 15, 2023. [Online]. Available: https://explorerf.com/transmission-line-animations.html
[8] “University program,” AWR Corp., El Segundo, CA, USA, 2023. [Online]. Available: https://awrcorp.com/register/customer.aspx?univ