Abstract of: US2024281640A1
Apparatuses, systems, and methods relate to technology to identify temporal logic that is associated with a controller of a physical system simulation, where the controller includes a first neural network. The technology generates a second neural network based on the temporal logic, and generates, with the second neural network, a robustness metric of the first neural network.
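The abstract does not define the robustness metric, but temporal-logic robustness is commonly computed with signal-temporal-logic (STL) style quantitative semantics. A minimal sketch under that assumption, for a controller's output trace (function names are illustrative, not from the patent):

```python
def robustness_always_gt(trace, threshold):
    """Robustness of 'always (y > threshold)' over a finite trace:
    the minimum margin; positive iff the property holds at every step."""
    return min(y - threshold for y in trace)

def robustness_eventually_gt(trace, threshold):
    """Robustness of 'eventually (y > threshold)': the maximum margin;
    positive iff the property holds at some step."""
    return max(y - threshold for y in trace)
```

In STL semantics, "always" maps to a minimum over time and "eventually" to a maximum, so the sign of the result indicates satisfaction and its magnitude indicates the margin.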
Abstract of: US2024281648A1
A computer-implemented method, system and computer program product for performing semantic matching in a data fabric. Knowledge graphs are populated with metadata enriched with master data. Based on such knowledge graphs, a trained multi-layer graph neural network generates embeddings. Furthermore, behavioral metadata from data stewards is monitored and collected. Such behavioral metadata may be used to further enrich the metadata populated in the knowledge graphs that are input into the multi-layer graph neural network to generate embeddings. Once the embeddings are generated, semantic matching of the data assets in the data fabric is performed using the embeddings. In this manner, semantic matching of the data assets in the data fabric is performed more effectively by utilizing master data and behavioral metadata.
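The abstract leaves the matching step abstract; a common realization is nearest-neighbor search over the embeddings by cosine similarity. A minimal sketch under that assumption (names and the threshold value are illustrative):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def semantic_match(asset_embeddings, query_embedding, threshold=0.8):
    """Return the names of data assets whose GNN embedding is
    sufficiently close to the query asset's embedding."""
    return [name for name, emb in asset_embeddings.items()
            if cosine(emb, query_embedding) >= threshold]
```

The embeddings themselves would come from the trained multi-layer graph neural network described in the abstract; only the matching stage is sketched here.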
Abstract of: US2024281657A1
Disclosed herein is the framework of causal cooperative networks, which discovers the causal relationship between observational data in a dataset and the label of the observation, and trains each model with inference of a causal explanation, reasoning, and production. In the case of supervised learning, neural networks are adjusted through the prediction of the label for observation inputs. By contrast, a causal cooperative network, which includes explainer, reasoner, and producer neural network models, receives an observation and a label as a pair, produces multiple outputs, and calculates a set of losses of inference, generation, and reconstruction from the input and the outputs. The explainer, the reasoner, and the producer are adjusted by error propagation for each model obtained from the set of losses.
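The forward pass and loss set described above can be sketched as follows, treating the three models as opaque callables and using squared error as a stand-in loss (the roles match the abstract; the loss form and signatures are assumptions):

```python
def cooperative_losses(x, y, explainer, reasoner, producer):
    """One forward pass of the three cooperating models and the
    losses from which each model's error propagation would start."""
    e = explainer(x, y)        # inferred causal explanation
    y_hat = reasoner(x, e)     # reasoning: predict the label
    x_hat = producer(e, y)     # production: regenerate the observation
    inference_loss = (y_hat - y) ** 2
    reconstruction_loss = (x_hat - x) ** 2
    return inference_loss, reconstruction_loss
```

Each model would then be updated from the loss terms it influences, rather than from a single end-to-end objective.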
Abstract of: US2024281431A1
A method of labeling training data includes inputting a plurality of unlabeled input data samples into each of a plurality of pre-trained neural networks and extracting a set of feature embeddings from multiple layer depths of each of the plurality of pre-trained neural networks. The method also includes generating a plurality of clusterings from the set of feature embeddings. The method also includes analyzing, by a processing device, the plurality of clusterings to identify a subset of the plurality of unlabeled input data samples that belong to a same unknown class. The method also includes assigning pseudo-labels to the subset of the plurality of unlabeled input data samples.
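The core of the analysis step, identifying samples that co-cluster consistently, can be sketched with a consensus over multiple clusterings: samples sharing a cluster in every clustering receive the same pseudo-label. A minimal sketch (the consensus rule is an assumption; the abstract does not specify how the clusterings are combined):

```python
def consensus_pseudo_labels(clusterings):
    """Given several clusterings (one list of cluster ids per clustering,
    aligned by sample index), group samples that share a cluster in every
    clustering and assign one pseudo-label per consensus group."""
    signatures = list(zip(*clusterings))  # per-sample tuple of cluster ids
    seen, labels = {}, []
    for sig in signatures:
        if sig not in seen:
            seen[sig] = len(seen)  # next fresh pseudo-label
        labels.append(seen[sig])
    return labels
```

In the patented method, each clustering would come from feature embeddings extracted at a different layer depth of a different pre-trained network; here they are just plain lists.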
Abstract of: US2024282098A1
Provided are an inspection method, a classification method, a management method, a steel material production method, a learning model generation method, a learning model, an inspection device, and steel material production equipment that can both improve detection accuracy and reduce processing time. The inspection method is an inspection method of detecting surface defects on an inspection target, the inspection method including: an imaging step (S1) of acquiring an image of a surface of the inspection target; an extraction step (S3) of extracting defect candidate parts from the image; a screening step (S4) of screening the extracted defect candidate parts by a first defect determination; and an inspection step (S5) of detecting harmful or harmless surface defects by a second defect determination using a convolutional neural network, the second defect determination being targeted at defect candidate parts after the screening by the first defect determination.
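The cost saving in the pipeline above comes from its two-stage structure: a cheap first determination screens the defect candidates so that the expensive CNN-based second determination runs only on the survivors. A minimal sketch of that control flow (the determinations are passed in as callables; their internals are not specified by the abstract):

```python
def inspect_surface(candidates, first_determination, second_determination):
    """Two-stage inspection: screen candidates with the fast first
    determination, then classify only the survivors with the expensive
    second determination (the CNN in the abstract)."""
    screened = [c for c in candidates if first_determination(c)]
    return [(c, second_determination(c)) for c in screened]
```

Because the CNN never sees candidates rejected by the screen, detection accuracy and processing time can be tuned somewhat independently, matching the stated goal.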
Abstract of: US2024281658A1
Disclosed is a new location threat monitoring solution that leverages deep learning (DL) to process data from data sources on the Internet, including social media and the dark web. Data containing textual information relating to a brand is fed to a DL model having a DL neural network trained to recognize or infer whether a piece of natural language input data from a data source references an address or location of interest to the brand, regardless of whether the piece of natural language input data actually contains the address or location. A DL module can determine, based on an outcome from the neural network, whether the data is to be classified for potential location threats. If so, the data is provided to location threat classifiers for identifying a location threat with respect to the address or location referenced in the data from the data source.
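The gating logic described above, classifying for threats only when the DL model infers a location reference, can be sketched as follows (the score threshold, the None-for-no-threat convention, and all names are illustrative assumptions):

```python
def monitor_feed(items, location_score, threat_classifiers, gate=0.5):
    """Gate each item on whether the DL model infers a reference to an
    address or location of interest; run the location threat classifiers
    only on the gated items."""
    threats = []
    for item in items:
        if location_score(item) >= gate:
            for classify in threat_classifiers:
                label = classify(item)
                if label is not None:
                    threats.append((item, label))
    return threats
```

The point of the gate is that the location need not appear verbatim in the text; the DL model's inference, not a string match, decides what reaches the classifiers.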
Abstract of: US2024281422A1
An auto-encoder model is provided that processes a dataset describing a physical part from a part catalogue in the form of a property co-occurrence graph, and performs entity resolution and auto-completion on the co-occurrence graph in order to compute a corrected and/or completed dataset. The encoder includes a recurrent neural network and a graph attention network. The decoder contains a linear decoder for numeric values and a recurrent neural network decoder for strings. The auto-encoder model provides an automated end-to-end solution that can auto-complete missing information as well as correct data errors such as misspellings or wrong values. The auto-encoder model is capable of auto-completion for highly unaligned part specification data with missing values.
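The decoder's split between numeric and string properties amounts to type-based routing of each field to the appropriate sub-decoder. A minimal sketch of that routing (the schema representation and decoder signatures are assumptions; the actual linear and recurrent decoders are not shown):

```python
def decode_properties(latent, schema, linear_decoder, string_decoder):
    """Route each property of the part record to the sub-decoder
    matching its type: numeric values to the linear decoder, strings
    to the recurrent-style decoder."""
    record = {}
    for name, kind in schema.items():
        decoder = linear_decoder if kind == "numeric" else string_decoder
        record[name] = decoder(latent, name)
    return record
```

Decoding every property from the shared latent code is what lets the model emit values for fields that were missing in the input, i.e. auto-completion.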
Publication No.: US2024281662A1 22/08/2024
Applicant:
GEORGIA TECH RES INST [US]
Georgia Tech Research Corporation
Abstract of: US2024281662A1
An exemplary co-processor system and method includes a solver circuit with a neural network configured for unsupervised learning, the solver circuit including: a binary input interface configured to receive binary inputs corresponding to variables and clauses for a Boolean problem; a state machine memory module; neuron circuits coupled to the state machine memory module; a neural network memory module having arrays of weights corresponding to nodes in a neural network; and a state machine circuit operably coupled to the state machine memory module, the binary input interface, and the neural network memory module, where the state machine circuit is configured to (i) compute a score from the Boolean states of the clauses; (ii) determine a plurality of learning probabilities to generate a plurality of weights; and (iii) provide the plurality of weights to the neural network memory module, where the weights are iteratively updated through unsupervised learning.
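Step (i), scoring the Boolean states of the clauses, and an unsupervised weight bump of the kind step (iii) feeds back can be sketched in software (the clause encoding and the fixed bump rate are illustrative assumptions; in the patent, per-clause learning probabilities would modulate the update):

```python
def clause_score(assignment, clauses):
    """Score = number of satisfied clauses. A clause is a list of
    (variable_index, polarity) literals and is satisfied when any
    literal matches the current Boolean assignment."""
    return sum(
        any(assignment[i] == polarity for i, polarity in clause)
        for clause in clauses
    )

def update_clause_weights(weights, assignment, clauses, rate=0.1):
    """Bump the weight of every clause the current assignment leaves
    unsatisfied, so the network is pushed toward satisfying it on the
    next iteration (an illustrative unsupervised update)."""
    return [
        w if any(assignment[i] == p for i, p in clause) else w + rate
        for w, clause in zip(weights, clauses)
    ]
```

Iterating score-then-update in this way is the software analogue of the hardware loop: the state machine circuit scores the clauses, derives weight updates, and writes them back to the neural network memory module.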