
Big Data

Name Investigator Tech ID Licensing Manager Name Licensing Manager Email Description Tags
High Frequency Pulsed Microjet Actuation William Oates 10-045 Brittany Ferraro <p>Flow control theory and actuator development have been the subject of intense research for more than a decade for applications on various aircraft structures, including fixed wings, cavity flow, rotor blades, and impinging jets.</p> <p>The present invention comprises a piezoelectric actuator for varying the throat geometry of a microjet nozzle, thereby varying the characteristics of the microjet produced by the nozzle. The inventive device is capable of producing pulsed flow as well as rapidly variable flow in order to provide active control. To our knowledge, most actuators presently available suffer from limited dynamic range, insufficient control authority, very high mass flow, complexity, size/weight, and/or lack of robustness. Piezoelectric materials, in particular, are used in stack actuator configurations for high-bandwidth nanoprecision control applications, which makes them ideal for actively throttling an array of microjets. The direct conversion of electrical energy to mechanical energy provides unique capabilities when coupled to an actively deforming supersonic nozzle.</p> <h2>Applications:</h2> <ul> <li>The proposed actuator can be incorporated into a wide variety of known flow control systems, and because it is so responsive in the frequency domain, an active control system (using feedback and/or feed-forward control loops) can be used. The control system can even vary the frequency in real time in order to disrupt unwanted oscillations in the flow.</li> <li>This invention should be of interest to a wide array of aerospace and aviation stakeholders who are actively pursuing active flow control for current and next-generation air and space vehicles. These include commercial aircraft manufacturers (e.g., Boeing, Airbus, and Gulfstream), the US Department of Defense (DARPA, Air Force, Navy, and Army), and NASA.</li> </ul>
Intelligent Wi-Fi Packet Relay Protocol Dr. Zhenghao Zhang 13-089 Michael Tentnowski <p>L2Relay is a novel packet relay protocol for Wi-Fi networks that can improve the performance and extend the range of the network. A device running L2Relay is referred to as a relayer, which overhears the packet transmissions and retransmits a packet on behalf of the Access Point (AP) or the node if no acknowledgement is overheard. L2Relay is ubiquitously compatible with all Wi-Fi devices. L2Relay is designed to be a layer 2 solution that has direct control over many layer 2 functionalities such as carrier sense. Unique problems are solved in the design of L2Relay including link measurement, rate adaptation, and relayer selection. L2Relay was implemented in the OpenFWWF platform and compared against the baseline without a relayer as well as a commercial Wi-Fi range extender. The results show that L2Relay outperforms both compared schemes.</p>
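The relay decision at the heart of L2Relay (overhear data frames, cache them, and retransmit only those for which no acknowledgement is overheard) can be sketched in a few lines. This is a minimal illustration; the class, method names, and buffering policy are assumptions for exposition, not L2Relay's actual layer-2 implementation:

```python
# Minimal sketch of the overhear-and-retransmit idea behind an L2 relayer.
# All names and the fixed ACK timeout are illustrative assumptions.

class Relayer:
    def __init__(self, ack_timeout=0.01):
        self.ack_timeout = ack_timeout   # seconds to wait for an overheard ACK
        self.buffer = {}                 # seq -> overheard frame payload

    def on_overheard_frame(self, seq, payload):
        """Cache every data frame we overhear, keyed by sequence number."""
        self.buffer[seq] = payload

    def on_overheard_ack(self, seq):
        """An ACK means the AP/node received the frame; drop our copy."""
        self.buffer.pop(seq, None)

    def pending_retransmissions(self):
        """Frames still unacknowledged after the timeout get resent by the relayer."""
        return sorted(self.buffer)

r = Relayer()
r.on_overheard_frame(1, b"hello")
r.on_overheard_frame(2, b"world")
r.on_overheard_ack(1)                   # frame 1 reached its destination
print(r.pending_retransmissions())      # frame 2 was never ACKed -> relay it
```

Because the relayer acts purely on overheard frames and ACKs, nothing about the AP or client needs to change, which is why the scheme is compatible with unmodified Wi-Fi devices.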
System and Method of Probabilistic Password Cracking Dr. Sudhir Aggarwal 11-189 Michael Tentnowski <p>Professor Aggarwal and his team have created a system and method of probabilistic password cracking.</p> <p>This technology is a novel password cracking system that generates password structures in highest probability order. Our program, called UnLock, automatically creates a probabilistic context-free grammar (CFG) based upon a training set of previously disclosed passwords.</p> <p>This CFG is then used to generate word-mangling rules and, from them, password guesses to be used in password cracking attacks.</p> <h2>Advantages:</h2> <ul> <li>Effectiveness demonstrated on real password sets</li> <li>Capable of cracking significantly more passwords in the same number of guesses than publicly available standard password cracking systems</li> <li>Tested in digital forensic missions</li> </ul>
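The idea of learning a grammar from disclosed passwords and emitting guesses in highest probability order can be illustrated with a toy PCFG. The structure classes (runs of letters/digits/symbols), the tiny training set, and the scoring below are simplified assumptions for exposition and do not reproduce UnLock:

```python
# Toy sketch of PCFG-style password guessing: learn base structures (runs of
# Letters/Digits/Symbols) and segment values from a training set, then emit
# guesses in descending probability order. Illustration only, not UnLock.
import itertools
import re
from collections import Counter

def segments(pw):
    """Split a password into maximal runs of letters, digits, or symbols."""
    return re.findall(r"[A-Za-z]+|[0-9]+|[^A-Za-z0-9]+", pw)

def seg_class(s):
    """Label a segment, e.g. 'password' -> 'L8', '123' -> 'D3'."""
    return ("L" if s[0].isalpha() else "D" if s[0].isdigit() else "S") + str(len(s))

def train(passwords):
    structures, values = Counter(), {}
    for pw in passwords:
        segs = segments(pw)
        structures["".join(seg_class(s) for s in segs)] += 1
        for s in segs:
            values.setdefault(seg_class(s), Counter())[s] += 1
    return structures, values

def guesses(structures, values, limit=10):
    """Expand every learned structure and rank guesses by probability."""
    out, total = [], sum(structures.values())
    for struct, n in structures.items():
        classes = re.findall(r"[LDS]\d+", struct)
        choices = [values[c].most_common() for c in classes]
        for combo in itertools.product(*choices):
            p = n / total
            for c, (v, k) in zip(classes, combo):
                p *= k / sum(values[c].values())
            out.append(("".join(v for v, _ in combo), p))
    return [g for g, _ in sorted(out, key=lambda t: -t[1])][:limit]

s, v = train(["password123", "letmein1", "admin123", "pass!23"])
print(guesses(s, v, 5))
```

Guesses recombine segment values across structures, which is why a PCFG can produce high-probability candidates that never appeared verbatim in the training set.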
System and Methods for Analyzing and Modifying Passwords Dr. Sudhir Aggarwal 12-044 Michael Tentnowski <p>Professor Aggarwal's team developed a system for analyzing and modifying passwords in a manner that provides the user with a strong yet usable and memorable password. The user proposes a password that has personal relevance and can be remembered. The invention evaluates the password to ascertain its strength. The evaluation is based on a probabilistic password cracking system that is trained on sets of revealed passwords and that can generate password guesses in highest probability order. If the user's proposed password is strong enough, it is accepted; if not, the system rejects it, modifies it, and suggests one or more stronger passwords. The suggested passwords involve only limited modifications to the proposed password, so the user ends up with a tested, strong, and memorable password.</p>
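The analyze-and-modify loop described above can be sketched with a crude stand-in strength score in place of the trained probabilistic cracking model. The scoring rule, threshold, and edit strategy here are illustrative assumptions only:

```python
# Sketch of the analyze-and-modify loop: score a proposed password, and if it
# is too weak, apply small edits until it passes. The real system scores
# passwords by probability under a trained cracking model, which this crude
# heuristic does not replicate.
import random
import string

random.seed(1)

def strength(pw):
    """Crude stand-in score: length plus character-class variety."""
    classes = [any(c.islower() for c in pw), any(c.isupper() for c in pw),
               any(c.isdigit() for c in pw), any(not c.isalnum() for c in pw)]
    return len(pw) + 3 * sum(classes)

def suggest(pw, threshold=18):
    """Return the password if strong enough, else a lightly modified variant."""
    while strength(pw) < threshold:
        # one small appended edit at a time keeps most of the original,
        # so the result stays memorable
        pw += random.choice(string.digits + "!@#$")
    return pw

print(suggest("sunshine"))
```

The key design point mirrored here is that modifications are incremental: the suggestion preserves the user's original string as a prefix rather than replacing it wholesale.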
Fast Dynamic Parallel Approximate Neighbor Search Data Structure Using Space Filling Curves Piyush Kumar 16-096 Michael Tentnowski <p>Nearest neighbor search (NNS) is a technique used in computing to quickly and accurately locate one data point in relation to another in a dataset organized so that distances between points are measured in Euclidean space. Increasingly, NNS computation is becoming a key sub-task in many algorithms and applications that are used to process, organize, cluster, learn from, and understand massive data sets, such as those used in the automotive, aerospace, and geographic information system (GIS) industries.</p> <h2>The Problem:</h2> <p>The NNS algorithm works well for small data sets, but it is too time-consuming for large ones. The approximate nearest neighbor (ANN) search, an alternative to NNS, improves search time and saves memory by estimating the nearest neighbor, without guaranteeing that the actual nearest neighbor will be returned in every case. Two limitations of this method are that it is difficult to make an ANN algorithm dynamic (i.e., to allow insertions and deletions in the data structure) or to parallelize it (i.e., to use multiple processors to speed up queries).</p> <h2>The Solution:</h2> <p>Dr. Kumar and his research team are developing a novel, practical, and theoretically sound method that solves the NNS problem in lower-dimensional spaces. Specifically, the researchers are creating an approximate k-nearest neighbor algorithm, based on Morton sorting of points, to build a software library for approximate nearest neighbor searches in Euclidean spaces. The library will use multi-core machines efficiently (parallel) and enable the insertion and deletion of points at run time (dynamic). This new algorithm delivers search results with expected logarithmic query times that are competitive with, or exceed, those of Mount's approximate nearest neighbor (ANN) search.</p> <h2>Advantages:</h2> <ul> <li>Speed on multicore machines</li> <li><span>Minimum spanning tree computation</span></li> </ul>
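The space-filling-curve idea behind the data structure can be sketched with 2-D Morton (Z-order) codes: interleaving coordinate bits yields an ordering that roughly preserves spatial locality, so a plain sorted array can answer approximate nearest-neighbor queries. This is an illustrative toy, not the library's algorithm; the window size, the 2-D restriction, and the linear code scan are simplifications:

```python
# Sketch of Morton (Z-order) sorting for approximate nearest-neighbor search.
import bisect

def morton2d(x, y, bits=16):
    """Interleave the bits of non-negative integer coordinates x and y."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
    return code

def build(points):
    """The 'index' is just the points sorted by Morton code; insertions and
    deletions are ordinary sorted-array updates, which is what makes the
    structure easy to keep dynamic."""
    return sorted(points, key=lambda p: morton2d(*p))

def approx_nn(index, q, window=4):
    """Check a small window around q's rank in Morton order."""
    codes = [morton2d(*p) for p in index]       # recomputed here for brevity
    i = bisect.bisect_left(codes, morton2d(*q))
    lo, hi = max(0, i - window), min(len(index), i + window)
    return min(index[lo:hi], key=lambda p: (p[0] - q[0])**2 + (p[1] - q[1])**2)

pts = build([(3, 7), (10, 2), (9, 9), (2, 2), (8, 3)])
print(approx_nn(pts, (9, 3)))   # -> (8, 3)
```

Because each query only touches a contiguous window of the sorted array, many queries can be answered independently on separate cores, which hints at why the Morton-order approach parallelizes well.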
Materials Genome Software to Accelerate Discovery of New Materials Jose Mendoza-Cortes 18-012 Garrett Edmunds <p>The creation of a materials genome can accelerate the discovery of new materials in much the same way the human genome is accelerating advances in gene therapy. It often takes 15-20 years to transfer advanced materials from the laboratory to the marketplace. Our predictive software utilizes unique databases of predicted materials to drastically accelerate the discovery of new materials by allowing users in research and industry to synthesize and characterize only the most promising compounds for the desired application, in lieu of experimental trial and error on thousands of candidates or more. Genomes and predictive algorithms for energy storage and light capture materials have been developed. This technology is primed to be commercialized as software as a service (SaaS).</p>
Photosynthetic Transcription Factors that Determine Bundle Cell Fate and Function Hongchang Cui 13-087 Brent Edington <p>Photosynthesis is one of the most important reactions on earth because its products are the ultimate energy source for all living organisms and the basis of the human food supply. Depending on the number of carbon atoms in the initial organic compound made in photosynthesis, plants can be grouped into C3 or C4 plants. C4 plants evolved from C3 plants, but they have several features that make them much more efficient photosynthetically than C3 plants:</p> <ol> <li>The enzyme for CO2 fixation in C4 plants, called PEP carboxylase, is not inhibited by oxygen. In contrast, the CO2 fixation enzyme in C3 plants, called RuBisCO, has an oxygenase activity, which reverses the photosynthetic reaction. The oxygenase activity is favored at high light and high temperature, making C3 plants perform even worse in warm areas where crop yield potential is high.</li> <li>In C4 plants, initial CO2 fixation occurs in the mesophyll cells, whereas C3-type photosynthesis is performed in the bundle sheath (BS) cells, which surround the vascular tissue, using the CO2 concentrated by and supplied from the mesophyll cells.</li> <li>C4 plants have more vascular bundles (veins), hence more BS cells, and a greater number of channels between BS and mesophyll cells, which ensures rapid transport between these two cell types. C4 plants are also efficient in water utilization. Because many important crops, such as rice and wheat, are C3 plants, huge resources have been invested to introduce C4 photosynthesis into C3 plants. Although BS cells are also present in C3 plants, they generally contribute little to photosynthesis, and this cell type has therefore become a primary target for C3-to-C4 bioengineering. Despite extensive research, the mechanism that controls BS cell fate has remained unknown.</li> </ol> <p>Our work identified three transcription factors (SCR, SCL23 and SHR) that are required for BS cell fate specification in Arabidopsis, a model C3 plant. SCR and SCL23 are both expressed specifically in BS cells. Though they act redundantly in determining the BS cell fate, they have distinct functions. Because similar genes are present in other plant species, including rice and maize (a C4 plant), we believe that similar mechanisms control BS cell fate determination in all C3 and C4 plants. These bundle sheath cell determinants offer a novel and powerful tool for C3-to-C4 engineering, which is regarded as a key solution to the demand for food and biomass by a rapidly growing world population.</p>
Systems and Methods for Improving Processor Efficiency Dr. David Whalley 13-101 Michael Tentnowski <p>Dr. Whalley's team has created data cache systems designed to enhance the energy efficiency and performance of computing systems. A data filter cache herein may be designed to store a portion of the data stored in a level one (L1) data cache. The data filter cache may reside between the L1 data cache and a register file in the primary compute unit. The data filter cache may therefore be accessed before the L1 data cache when a request for data is received and processed. Upon a data filter cache hit, access to the L1 data cache may be avoided. The smaller data filter cache may therefore be accessed earlier in the pipeline than the larger L1 data cache to promote improved energy utilization and performance. The data filter cache may also be accessed speculatively based on various conditions to increase the chances of having a data filter cache hit.</p> <p>Furthermore, tagless access buffers (TABs) can optimize energy efficiency in various computing systems. Candidate memory references in an L1 data cache may be identified and stored in the TAB. Various techniques may be implemented for identifying the candidate references and allocating the references into the TAB. Groups of memory references may also be allocated to a single TAB entry or to an extra TAB entry (such that two lines in the TAB may be used to store L1 data cache lines), for example, when a strided access pattern spans two consecutive L1 data cache lines. Certain other embodiments are related to data filter cache and multi-issue tagless hit instruction cache (TH-IC) techniques.</p>
Sub-seasonal Forecasts of Winter Storms and Cold Air Outbreaks Dr. Ming Cai 16-090 Michael Tentnowski <p class="lead">"Our technology is a dynamics-statistics hybrid model to forecast continental-scale cold air outbreaks 20-50 days in advance, beyond the 2-week limit of predictability for weather."</p> <p style="font-size: 18px;" class="font_8"><span style="font-size: 18px;">Professor Cai's team has developed a technology that allows them to make sub-seasonal forecasts of cold air outbreaks in the winter season. These forecasts are made on the basis of the relationship between the intensity of the atmospheric mass circulation and cold air outbreaks. The atmospheric poleward mass circulation aloft into the polar region, including the stratospheric component, is coupled with the equatorward mass circulation out of the polar region in the lower troposphere. The strengthening of the latter is responsible for cold air outbreaks in mid-latitudes.</span></p> <p style="font-size: 18px;" class="font_8"><span style="font-size: 18px;">Due to the inherent predictability limit of 1-2 weeks for numerical weather forecasts, operational numerical weather forecast models no longer have useful skill beyond a lead time of about 10 days. Recently, research carried out by Professor Cai and his team has shown that operational numerical weather forecast models do possess useful skill for atmospheric anomalies over the polar stratosphere in cold seasons, owing to the models' ability to capture the poleward mass circulation into the polar stratosphere.</span></p> <p style="font-size: 18px;" class="font_8"><span style="font-size: 18px;">They calculate the stratospheric mass transport into the polar region from forecast outputs of the US NOAA NCEP's operational CFSv2 model and use it as their forecast of the strength of the atmospheric mass circulation. Its anomalous strengthening is indicative of a high probability of cold air outbreaks in mid-latitudes. They further derive a set of forecast indices describing the state of the stratospheric mass circulation to obtain the detailed spatial pattern and intensity of the associated cold air outbreak events.</span></p> <p style="font-size: 18px;" class="font_8"><span style="font-size: 18px;">Because cold air outbreak events are accompanied by the development of low and high pressure systems and frontal circulations, these forecasts of cold air outbreaks are also indicative of snow, freezing rain, high wind, icing, and other winter-storm-related hazards, in addition to a large area of below-normal temperatures.</span></p> <p><a href="">Forecast website:</a></p> <p><a href="">Professor Cai in the news</a></p>
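The hybrid dynamics-statistics idea (turn a model's forecast of stratospheric mass transport into an index, then flag an outbreak risk when the index is anomalously strong) can be sketched as a simple anomaly threshold. The climatology values, threshold, and index units below are invented placeholders, not the CFSv2-based operational product:

```python
# Sketch of the index-threshold idea: treat forecasted stratospheric mass
# transport as an index, and flag elevated cold-air-outbreak probability when
# its standardized anomaly exceeds a threshold. All numbers are invented.
import statistics

climatology = [10.2, 9.8, 10.5, 10.0, 9.9, 10.1]   # historical index values
mean = statistics.mean(climatology)
sd = statistics.stdev(climatology)

def outbreak_alert(forecast_index, z_threshold=1.5):
    """True when the forecast index is anomalously strong vs. climatology."""
    z = (forecast_index - mean) / sd
    return z > z_threshold

print(outbreak_alert(10.8), outbreak_alert(10.0))   # -> True False
```

The operational system goes further, deriving multiple indices to recover the spatial pattern and intensity of an event rather than a single yes/no flag.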
Fingerprint for Cell Identity and Pluripotency David Gilbert 12-028 Brent Edington <p>At Florida State University, we have developed a method to identify sets of regions that replicate at unique times in any given cell type (replication timing fingerprints). Using pluripotent stem cells as an example, we show that genes in the pluripotency fingerprint belong to a class previously shown to be resistant to reprogramming in induced pluripotent stem cells (iPSCs), identifying potential new target genes for more efficient iPSC production. We propose that the order in which DNA is replicated (replication timing) provides a novel means for classifying cell types and can reveal cell-type-specific features of genome organization.</p> <p>A major advantage of our fingerprinting method is the selection of a minimal set of regions that allows for classification with a straightforward PCR-based timing assay and a reasonably small set of primers, particularly if only cell-type-specific regions are examined. Our results suggest that a standard set of 20 fingerprint loci can be effective for classification, though the number of regions queried can be adjusted based on the confidence level required. The sole requirement for replication profiling is the collection of a sufficient number of proliferating cells for sorting on a flow cytometer. Similarly, just as replication fingerprints can be generated for particular cell types or general categories of cells, features of replication profiles allow for the creation of disease-specific fingerprints, which may be valuable for prognosis. We have also identified regions that may undergo important organizational changes upon differentiation.</p>
System and Method for Generating a Benchmark Dataset for Real Noise Reduction Evaluation Dr. Adrian Barbu 15-046 Garrett Edmunds <p>Images taken with smartphones or point-and-shoot digital cameras often come out noisy due to lack of sufficient lighting. This low-light noise problem is widespread, being present in all of the more than 1 billion smartphones in the world, and it leaves consumers disappointed and frustrated with the quality of images taken in low light. While a number of commercial denoising packages are already available on the market, the majority of them are trained on images corrupted by artificial noise rather than on real low-light noisy images. Since artificial noise has different characteristics than real noise, these packages do not perform as well as a denoising algorithm trained on images corrupted by real noise, such as the one we have created.</p> <p>We have developed a fully automatic, state-of-the-art algorithm (RENOIR) for denoising smartphone and digital camera images that have low-light noise problems. The RENOIR algorithm could either be sold directly to the public as a standalone application or be licensed to smartphone or digital camera manufacturers to be embedded in their devices.</p>
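For readers unfamiliar with denoising, the basic principle of averaging away noise can be shown with a one-dimensional mean filter. This is only a textbook illustration of what "denoising" means; RENOIR is a trained algorithm for real low-light camera noise and works very differently:

```python
# Toy 1-D mean-filter denoiser: each sample is replaced by the average of its
# neighborhood, smoothing out zero-mean noise. Purely illustrative; it does
# not reproduce RENOIR, which is trained on real low-light noise.

def denoise(signal, radius=1):
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))   # window mean
    return out

noisy = [1.0, 1.4, 0.6, 1.1, 0.9, 1.0]   # a flat signal plus noise
print(denoise(noisy))
```

The reason training data matters, as the description argues, is that real sensor noise is not the zero-mean, pixel-independent noise this kind of simple model assumes.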
A Robust Method to Measure the Temporal Order of Replication of all Chromosome Segments in a Cell Dr. Gilbert 12-102 Brent Edington <p>Cancers have unique replication timing fingerprints that hold great promise as a novel genre of biomarkers; despite the heterogeneity among individual cancers, each cancer more closely resembles its tissue of origin than other tissue types. This phenomenon demonstrates the great promise of replication timing profiling for determining the tissue of origin of metastatic cancers.</p> <p>There are many biomarkers on the market for cancer; most involve tests for chromosome abnormalities, while some involve gene expression tests. Current biomarkers are only partially effective at diagnosis. Our technology provides a completely novel genre of biomarkers that cannot be detected by any other existing method.</p> <p>This technology can provide a completely novel type of tissue-of-origin test; it queries the entire genome simultaneously and is therefore more comprehensive.</p>
Lipid Multi-Layer Gratings for Semi-Synthetic Quorum Sensors Dr. Lenhert 11-067 Brent Edington <p>The present invention provides a device comprising a substrate and a quorum sensor array on the substrate. The quorum sensor array comprises quorum sensors that release signal molecules in response to one or more environmental signals sensed by the quorum sensors, thereby amplifying the one or more environmental signals by causing a signal chain reaction in neighboring quorum sensors of the array; each of the quorum sensors comprises a lipid multi-layer structure.</p> <p>The present invention also provides a method comprising the following steps: (a) detecting with a camera one or more light intensities of light scattered by one or more iridescent microstructures of a sample, and (b) determining a height of each of the one or more iridescent microstructures of the sample based on the one or more light intensities detected in step (a) and a calibration profile for the camera, wherein the calibration profile is based on light intensities detected by the camera for light scattered by one or more patterned arrays of standard iridescent microstructures of a calibration standard, and wherein each of the patterned arrays of iridescent microstructures comprises iridescent microstructures having the same shape and two or more different heights.</p>
Pressure Sensors Including an Ionic Conduction Sensing Mechanism Dr. Liu 08-132 Brittany Ferraro <p>The present invention describes thin film sensors for detecting the presence, intensity, and/or location of a compressive force or pressure, using ionic conduction variation as the sensing principle. By careful choice of soft materials (elastomer-like polymers and polymeric gel electrolytes/polymer electrolytes) in combination with appropriate patterning, the present invention offers low-pressure-level sensing and mapping capability with enhanced sensitivity. The sensor includes a plurality of conducting elements spaced apart from each other and at least one deformable electrolyte bridge contacting each of the conducting elements at one or more contact points having an aggregate contact area. Upon formation of an ionic circuit between two of the conducting elements, a first resistivity between the two conducting elements exists. Upon application of a compressive force on the at least one deformable electrolyte bridge directed toward at least one of the conducting elements, the aggregate contact area increases such that a second resistivity between the two conducting elements exists. The difference between the first and second resistivity can be correlated with the pressure or mechanical displacement to be measured.</p> <h2>Applications:</h2> <ul> <li>This invention has numerous potential applications in pressure sensing and mapping, e.g., seat occupancy detection for the automobile industry, tactile feedback for robots to sense and respond to environments, rehabilitation progress monitoring of patients for the medical industry, biting force mapping in dentistry applications, and measuring force on golf club grips.</li> </ul>
Liposome Micro- and Nano-Arrays for Molecular Screens in Cell Culture Dr. Lenhert 11-191 Brent Edington <p>The proposed invention describes the use of surface-supported liposome arrays as a platform for screening molecular libraries in cell culture models. Drug candidates encapsulated in surface-supported liposomes are arrayed on a surface to form lipid multilayer arrays. The surface has been functionalized to ensure liposome uptake by the cells. Cells are cultured on these arrays and their responses to the liposomes are monitored optically. Multiple liposome compositions and different lipids or other additives printed onto the same surface can be screened simultaneously. The drugs that are and are not working can be identified by their position on the surface.</p> <p>In contrast to current small-molecule microarrays for drug screening, our invention does not require compounds to be covalently attached to a surface, and cells can be grown on the surface. Covalent attachment of a small molecule to the surface prevents internalization of the compound, limiting the types of tests that can be carried out. Furthermore, the number of molecules that a single cell can see is limited by the surface it contacts. Diffusion of small molecules from array sources such as gels has also been used for screening, although molecular diffusion limits the applicability of those methods. Using surface-supported lipid multilayers to encapsulate drug candidates solves these problems.</p> <h2>Applications:</h2> <ul> <li>Screening of delivery systems, particularly for lipophilic drug candidates</li> <li>Drug-resistance cell screening, where cells from biopsies are cultured ex situ</li> </ul>
The Lookahead Instruction Fetch Engine (LIFE) Dr. David Whalley 08-033 Michael Tentnowski <p>The Lookahead Instruction Fetch Engine (LIFE) provides a mechanism to guarantee instruction fetch behavior in order to avoid accesses to fetch-associated structures, including the level one instruction cache (L1 IC), instruction translation lookaside buffer (ITLB), branch predictor (BP), branch target buffer (BTB), and return address stack (RAS). Systems and methods may be provided for lookahead instruction fetching for processors. The systems and methods may include an L1 instruction cache, where the L1 instruction cache may include a plurality of lines of data, where each line of data may include one or more instructions. The systems and methods may also include a tagless hit instruction cache, where the tagless hit instruction cache may store a subset of the lines of data in the L1 instruction cache, where instructions in the lines of data stored in the tagless hit instruction cache may be stored with metadata indicative of whether a next instruction is guaranteed to reside in the tagless hit instruction cache, where an instruction fetcher may be arranged to have direct access to the L1 instruction cache and the tagless hit instruction cache, and where the tagless hit instruction cache may be arranged to have direct access to the L1 instruction cache.</p> <p>LIFE can reduce both energy consumption and power requirements with no or negligible impact on application execution times. It can be used to reduce energy consumption in embedded processors to extend battery life, and to decrease the power requirements of general-purpose processors to help address heat issues. Unlike most energy-saving features, LIFE does not come at the cost of increased execution time. It will result in a significant improvement over the state of the art and will extend battery life, making mobile computing more practical. Finally, it will allow general-purpose processors to run at a faster clock rate while generating similar heat.</p>
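The "guaranteed fetch" idea can be made concrete with a toy trace simulation: when consecutive fetches both reside in the small tagless cache, metadata could have guaranteed the second fetch, letting the fetch engine skip tag checks and the other fetch-associated structures. The cache layout, eviction rule, and counting below are illustrative assumptions, not the LIFE design:

```python
# Toy trace model of the "guaranteed hit" idea behind a tagless hit
# instruction cache (TH-IC): a fetch is counted as guaranteed when both it
# and the previous fetch reside in the small cache, so no tag check or
# L1/BP/BTB access would be needed. All parameters are illustrative.

TH_IC_LINES = 8

def simulate(fetch_sequence):
    cached = set()
    guaranteed_hits = 0
    prev = None
    for addr in fetch_sequence:
        if prev is not None and prev in cached and addr in cached:
            guaranteed_hits += 1          # metadata would guarantee this fetch
        cached.add(addr)
        if len(cached) > TH_IC_LINES:
            cached.remove(min(cached))    # crude eviction for the sketch
        prev = addr
    return guaranteed_hits

loop = [0, 1, 2, 3] * 5                   # a hot loop re-fetches the same lines
print(simulate(loop))                     # -> 16
```

Tight loops dominate instruction fetch in practice, which is why most fetches in this model become guaranteed after the first iteration; that is the source of LIFE's energy savings without an execution-time cost.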
Method for Real-Time Probabilistic Inference with Bayesian Networks on GPGPU Devices Robert Van Engelen 17-013 Michael Tentnowski <p>General Purpose Graphics Processing Unit (GPGPU) devices are used in most PCs for graphics, are popular for high-performance computing, and are relatively inexpensive. However, algorithms must be specifically designed for these devices.</p> <p>The proposed invention consists of a process and a method for efficient probabilistic inference with Bayesian probabilistic networks on GPGPU devices. Bayesian probabilistic networks are widely used for modeling probability beliefs in computational biology and bioinformatics, healthcare, document classification, information retrieval, data fusion, decision support systems, security and law enforcement, betting/gaming, and risk analysis.</p> <p>The invention consists of:</p> <ol> <li>A novel “parallel irregular wavefront process” for importance sampling with Bayesian probabilistic networks, tailored to the specific GPGPU device being used</li> <li>A novel method to structure the Bayesian probabilistic network in GPGPU local memories to ensure optimal data access.</li> </ol> <p>This invention increases the efficiency of probabilistic inference with Bayesian probabilistic networks on GPGPU devices. This is achieved by the specialized organization of data in the memory of these devices and by the optimized parallel process, which together produce results faster. The efficiency and performance gains grow with increasingly larger Bayesian probabilistic networks, i.e., the approach scales favorably with larger networks, thereby making real-time probabilistic inference possible on large data sets and realistic applications.</p>
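The inference step the invention parallelizes, importance sampling on a Bayesian network, can be illustrated sequentially with likelihood weighting on a two-node network. The network, its probabilities, and the sequential loop are illustrative; the invention's contribution is the GPGPU wavefront scheduling and memory layout, which this sketch does not attempt:

```python
# Minimal sketch of importance sampling (likelihood weighting) on a tiny
# Bayesian network Rain -> WetGrass. Sequential version for illustration;
# the invention parallelizes this kind of sampling on GPGPU hardware.
import random

random.seed(0)

P_RAIN = 0.2
P_WET_GIVEN = {True: 0.9, False: 0.1}     # P(WetGrass=true | Rain)

def likelihood_weighting(evidence_wet, n=100_000):
    """Estimate P(Rain=true | WetGrass=evidence_wet)."""
    weights = {True: 0.0, False: 0.0}
    for _ in range(n):
        rain = random.random() < P_RAIN    # sample the non-evidence node
        p_e = P_WET_GIVEN[rain]            # weight by evidence likelihood
        w = p_e if evidence_wet else 1 - p_e
        weights[rain] += w
    return weights[True] / (weights[True] + weights[False])

# exact posterior is 0.2*0.9 / (0.2*0.9 + 0.8*0.1) = 0.6923...
print(round(likelihood_weighting(True), 2))
```

Each sample is independent, which is exactly what makes the method amenable to massive GPGPU parallelism; the hard part the invention addresses is scheduling the irregular per-node dependencies and laying the network out in device-local memory.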
Hypergeometric Solutions of Second Order Linear Differential Equations Dr. Mark van Hoeij 19-008 Brittany Ferraro <p>Researchers, scientists, and engineers across a wide range of fields depend on computer algebra systems to solve complex equations such as linear differential or difference equations. Solutions to these equations often cannot be expressed by a closed-form expression. Functions are in closed form if they are written in terms of commonly used functions such as exp and log. Using computer systems to solve equations whose solutions are not found in closed form leads to an error or no solution being returned. This can lead one to believe that there is no solution to the equation, even when that is not the case.</p> <p>Dr. Mark van Hoeij has extensively researched such equations, including second order linear differential equations. He has created a novel algorithm to find closed-form solutions to previously unsolvable equations. This state-of-the-art algorithm can be implemented in current computer algebra systems.</p> <h2><strong>Advantages</strong></h2> <ul> <li>State-of-the-art algorithm to find solutions not readily available</li> <li>Easily integrated with current software</li> </ul> mathematics,software,analytics,analytical software,algebra,computer system,code
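To illustrate the setting (not Dr. van Hoeij's algorithm itself), the sketch below numerically verifies that a closed-form function satisfies an instance of Gauss's hypergeometric equation x(1-x)y'' + [c-(a+b+1)x]y' - ab·y = 0: with a=1 and b=c, the equation has the closed-form solution y = 1/(1-x). Finding such closed forms automatically is what the algorithm does; checking one numerically is straightforward:

```python
# Numerical check that y = 1/(1-x) solves the Gauss hypergeometric equation
#   x(1-x) y'' + [c - (a+b+1)x] y' - a*b*y = 0   with a = 1, b = c = 2.
# Derivatives are approximated by central differences; this only illustrates
# the kind of closed-form solution such an algorithm must discover.

a, b, c = 1.0, 2.0, 2.0
h = 1e-4                                   # finite-difference step

def y(x):
    return 1.0 / (1.0 - x)

def residual(x):
    y1 = (y(x + h) - y(x - h)) / (2 * h)             # y'
    y2 = (y(x + h) - 2 * y(x) + y(x - h)) / h**2     # y''
    return x * (1 - x) * y2 + (c - (a + b + 1) * x) * y1 - a * b * y(x)

# residual is ~0 (up to finite-difference error) on the interval (0, 1)
print(max(abs(residual(x)) for x in [0.1, 0.3, 0.5, 0.7]))
```

A computer algebra system without the new algorithm would report no closed-form solution for many equations of this class even though, as the check shows, one exists.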
Interactive Large Scale Data Profiling Mikhail Gubanov 21-021 Michael Tentnowski <p>Data profiling is a set of statistical data analysis activities and processes to determine the properties of a given dataset. A large-scale dataset may contain millions of tables, whose metadata (i.e., titles, attribute names, and types) becomes as abundant as the data instances themselves, and profiling it becomes similarly demanding. WebLens is an interactive, scalable metadata profiler for large-scale structured data. It is built on a new data structure, the metadata profile, coupled with machine/deep-learning models trained to construct it. A metadata profile represents a summary of a specific real-world object collected over millions of data sources. These profiles significantly simplify access to large-scale structured datasets for scientists and end users.</p>
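What "profiling" means at the column level can be shown with a toy profiler that infers a type and a few summary statistics per attribute. This is only an illustration of the concept; WebLens profiles metadata across millions of tables with learned models, which this sketch does not attempt:

```python
# Toy column-level data profiler: for each attribute, infer a type and a few
# summary statistics. Real metadata profilers like WebLens operate at far
# larger scale and use trained models; this only illustrates "profiling".
from collections import Counter

def profile(rows, columns):
    out = {}
    for i, col in enumerate(columns):
        values = [r[i] for r in rows if r[i] is not None]
        numeric = all(isinstance(v, (int, float)) for v in values)
        out[col] = {
            "type": "numeric" if numeric else "text",
            "nulls": sum(1 for r in rows if r[i] is None),
            "distinct": len(set(values)),
            "top": Counter(values).most_common(1)[0][0] if values else None,
        }
    return out

rows = [("FSU", 1851), ("UF", 1853), ("FSU", None)]
print(profile(rows, ["school", "founded"]))
```

A metadata profile in the WebLens sense aggregates this kind of per-source summary across many tables describing the same real-world object, rather than one table at a time.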
Diagnostic Elementary Reading Profile App Yaacov Petscher 17-011 Brittany Ferraro <p>This novel app uses empirical classification schemes via latent mixture models and classification and regression tree (CART) analysis to classify students into profiles of readers based on their fluency performance in grades K-2 in the fall, winter, and spring. The empirically derived classification schemes (decision rules appended to this documentation) are generated based on user input of a set of fluency scores. The purpose of the system is to provide teachers, parents, school administrators, and students a set of recommended practices for instruction based on empirical classifications. In the current state of score profiling, the teacher is expected to group students based on performance on one assessment, yet when a student is administered a group of assessments (n &gt; 1), it is difficult to 1) reliably group students together, 2) group students in a manner that is valid, 3) make rapid sense of the relative strengths and weaknesses of student reading scores, and 4) provide appropriate instruction and/or remediation based on the groupings. By using the Diagnostic Elementary Reading Profile app, students will be automatically sorted into empirically derived groupings at any given time-point during kindergarten through second grade. This will reduce assessment and work time for the teacher, as the sorting and recommendations will occur automatically. The appended decision rules were normed on a set of 60,000 students that is nationally representative in terms of race/ethnicity, achievement, socio-economic status, and English language status. A major advantage is that the app automatically classifies students into reliable and valid groups for instructional purposes in K-2 classrooms. Present workarounds in the field are largely theory-driven guesses without data-driven support. The novel feature is the use of mixture modeling along with classification and regression trees (CART) to empirically define the rules for classifying students into profiles.</p>
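Applying CART-style decision rules to fluency scores amounts to walking a small decision tree. The thresholds, score names, and profile labels below are invented placeholders; the app's actual rules were empirically derived from the 60,000-student norming sample:

```python
# Sketch of applying CART-style decision rules to sort students into reading
# profiles from fluency scores. Thresholds and profile names are hypothetical
# placeholders, not the app's empirically derived rules.

def classify(letter_fluency, word_fluency):
    """Walk a tiny hypothetical decision tree; return a profile label."""
    if word_fluency < 20:
        if letter_fluency < 30:
            return "intensive support"
        return "word-reading support"
    if word_fluency < 50:
        return "developing reader"
    return "on track"

students = [("Ana", 25, 10), ("Ben", 60, 35), ("Cai", 80, 70)]
for name, lf, wf in students:
    print(name, "->", classify(lf, wf))
```

The value of learning such rules empirically, rather than hand-picking thresholds as above, is exactly the reliability and validity of the resulting groupings that the description emphasizes.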
Computer Adaptive Testing Simulation and Analysis Based on Item Response Theory Cody Diefenthaler 17-027 & 17-028 Brittany Ferraro <p>WebCatCore is a novel framework for Computer Adaptive Testing (CAT) deliverable via the web. This tool takes an input item pool (such as 1P, 2P, and 3P models) and custom test criteria (such as stop rules, standard error thresholds, etc.) and delivers items for administration based on Item Response Theory (IRT) methodology. This easily configurable, plug-and-play system is an ideal solution for integrating Computer Adaptive Testing into web learning platforms. WebCatCore handles any customization requirements by providing configurable variables and consumable endpoints. The logic, algorithms, and methodology are all self-contained within the framework, making the porting process straightforward. Using WebCatCore will drastically reduce development time and cost.</p> <p>WebCatSlim is a novel web platform which works seamlessly with WebCatCore. This platform simulates and analyzes item pools designed for CAT. It loads item pools based on custom test criteria and simulates the administration of the test across a user-defined range of simulated abilities. Once complete, the simulations are analyzed according to industry-standard test evaluation metrics, then visualized as graphs, charts, and tables. The simulation results can be exported into data files which can be loaded at a later date, and the simulation analysis visualizations support print functionality. WebCatSlim provides a way for researchers and test evaluators to simulate administration regardless of operating system. Additionally, the visualization of graphs, charts, and tables allows experts and non-experts alike to easily understand and analyze simulated test performance metrics.</p>
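The IRT machinery behind CAT item selection can be illustrated with the two-parameter logistic (2PL) model: at each step, administer the item with maximum Fisher information at the current ability estimate. The item parameters below are made up for illustration; WebCatCore's configuration, models, and stop rules are more general:

```python
# Illustration of 2PL IRT item selection for CAT: pick the item with maximum
# Fisher information at the current ability estimate theta. Item parameters
# are invented; this is not WebCatCore's implementation.
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response (a = discrimination, b = difficulty)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1 - p)

# item pool: (name, discrimination a, difficulty b)
pool = [("easy", 1.2, -1.5), ("medium", 1.0, 0.0), ("hard", 1.5, 1.8)]

def next_item(theta, pool):
    return max(pool, key=lambda it: information(theta, it[1], it[2]))

print(next_item(0.1, pool)[0])   # near-average ability -> "medium"
```

This is why adaptive tests converge quickly: items far from the examinee's ability carry little information, so the selector naturally tracks the evolving ability estimate, subject to the stop rules and error thresholds the framework exposes.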