
usdaed


Science

Faster Aerodynamic Design with Graph Networks and Smart Data

27.12.2025 by qfx

The study demonstrates that test Mean Squared Error (test MSE) scales predictably with the training set size [latex]D_D[/latex], the number of unique geometry-flow snapshots represented as graphs, and that this scaling behavior differs significantly across models of different sizes.
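As a sketch of how such a power-law scaling relation can be read off in practice, the snippet below fits test MSE against dataset size in log-log space. All numbers are invented for illustration and are not from the paper.

```python
import numpy as np

# Hypothetical measurements: dataset sizes and test MSE values (illustrative).
D = np.array([1e3, 3e3, 1e4, 3e4, 1e5])
mse = np.array([0.41, 0.23, 0.12, 0.07, 0.04])

# Fit testMSE ~ c * D^(-alpha) via least squares in log-log space.
slope, intercept = np.polyfit(np.log(D), np.log(mse), 1)
alpha, c = -slope, np.exp(intercept)
print(f"alpha ~ {alpha:.2f}, c ~ {c:.2f}")

# Use the fitted law to estimate the dataset size needed for a target MSE.
target = 0.02
D_needed = (c / target) ** (1 / alpha)
print(f"estimated snapshots for MSE={target}: {D_needed:.0f}")
```

This is the basic workflow behind data scaling laws: fit on cheap small-scale runs, then extrapolate how much data a target accuracy would require.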

Researchers have created a new dataset and scaling laws to accelerate aerodynamic simulations using graph neural networks, enabling efficient design even with limited data.

Categories Science

When Conversations Get Confused: A New Test for Chatbot Clarity

27.12.2025 by qfx

Effective communication between users and large language models hinges on clarifying ambiguous or contradictory input, as demonstrated by the ability of follow-up questioning to resolve initial uncertainties and ensure alignment with user intent.

Researchers have created a benchmark and framework to help conversational AI better navigate ambiguity and ask clarifying questions during extended dialogues.

Categories Science

Beyond the Network Boundary: Adapting Traffic Analysis to New Environments

27.12.2025 by qfx

Domain characteristics vary considerably across network environments, with the Campus domain exhibiting the highest proportion of elephant flows, reaching 15.0%, while the UNSW-NB15 dataset provides the most extensive data for analysis, comprising 82,332 flows.
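A minimal illustration of the elephant-flow proportion reported above, assuming a simple byte-volume threshold; the threshold and flow sizes here are invented, and the paper's actual definition may differ.

```python
# Classify flows as "elephants" above a byte-volume threshold and compute
# their proportion, analogous to the per-domain figures in the study.
flow_bytes = [120, 480, 95_000, 310, 2_400_000, 150, 780_000, 60]
threshold = 100_000  # bytes; hypothetical cutoff for an elephant flow

elephants = [b for b in flow_bytes if b >= threshold]
proportion = len(elephants) / len(flow_bytes)
print(f"elephant flows: {len(elephants)}/{len(flow_bytes)} = {proportion:.1%}")
```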

Detecting unusually large network traffic flows – ‘elephant flows’ – becomes significantly harder when models are moved between different network setups, and this research tackles that challenge.

Categories Science

When Memories Fade: Understanding Forgetting in AI

27.12.2025 by qfx

New research reveals how the depth of knowledge representation impacts a model’s ability to retain information when learning new tasks, offering a path towards more robust artificial intelligence.

Categories Science

What Large Language Models Still Don’t Know

26.12.2025 by qfx

The Competency Gap method decomposes large language model evaluation into interpretable benchmark and model gaps by leveraging a concept dictionary learned through sparse autoencoding. It quantifies how strongly each benchmark activates individual concepts and projects model performance into concept space, yielding per-concept scores across benchmarks and evaluation suites.
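Under simplifying assumptions, the projection into concept space might look like the sketch below: an activation-weighted average of benchmark scores per concept. The activation matrix and scores are invented, and the real method's concept dictionary comes from sparse autoencoding rather than being hand-written.

```python
import numpy as np

# A[i, k] = how strongly benchmark i activates concept k (illustrative stand-in
# for a learned concept dictionary); s[i] = the model's score on benchmark i.
A = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.1, 0.2, 0.7]])   # 3 benchmarks x 3 concepts
s = np.array([0.9, 0.6, 0.3])     # model accuracy per benchmark

# Project benchmark-level scores into concept space: each concept's score is
# the activation-weighted average of the scores on benchmarks that use it.
per_concept = (A.T @ s) / A.sum(axis=0)
print(per_concept)
```

Low per-concept scores then point at specific competencies the model lacks, rather than a single aggregate benchmark number.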

A new method reveals critical weaknesses in today’s most powerful AI systems and highlights shortcomings in how we measure their abilities.

Categories Science

Squeezing Value from Spot Instances for Large Language Model Training

26.12.2025 by qfx

Time series analysis using the ARIMA model effectively forecasts both spot availability and price fluctuations.
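The study uses ARIMA; as a dependency-free stand-in, the sketch below fits a plain AR(1) model to a synthetic spot-price series by least squares and makes a one-step forecast. All values are synthetic, not real cloud prices.

```python
import numpy as np

# Generate a mean-reverting synthetic spot-price series: p_t = 0.5 + 0.5*p_{t-1} + noise.
rng = np.random.default_rng(0)
prices = [1.0]
for _ in range(200):
    prices.append(0.5 + 0.5 * prices[-1] + 0.02 * rng.standard_normal())
prices = np.array(prices)

# Fit p_t = c + phi * p_{t-1} by least squares, then forecast one step ahead.
X = np.column_stack([np.ones(len(prices) - 1), prices[:-1]])
c, phi = np.linalg.lstsq(X, prices[1:], rcond=None)[0]
forecast = c + phi * prices[-1]
print(f"phi ~ {phi:.2f}, next-step price forecast ~ {forecast:.3f}")
```

A full ARIMA(p, d, q) model adds differencing and moving-average terms on top of this autoregressive core; the scheduling framework would feed such forecasts into its placement decisions.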

New research details a smart scheduling framework that minimizes costs and meets deadlines when fine-tuning massive AI models using fluctuating cloud GPU pricing.

Categories Science

Strength in Numbers: A New Defense Against AI Attacks

26.12.2025 by qfx

The architecture decomposes complex functions into specialized sub-networks, the ‘experts’, and a gating mechanism dynamically routes inputs to these experts. This allows the system to adapt its capacity and maintain performance even as demands shift and decay over time, a strategy mirroring the graceful degradation observed in resilient systems.
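A minimal mixture-of-experts forward pass, sketched with NumPy, shows the gating idea: a softmax gate weights the outputs of several expert sub-networks. Weights and dimensions are illustrative, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
d_in, d_out, n_experts = 4, 2, 3

W_gate = rng.standard_normal((d_in, n_experts))          # gating weights
W_experts = rng.standard_normal((n_experts, d_in, d_out))  # one linear expert each

def moe_forward(x):
    # Softmax over expert logits gives routing weights that sum to 1.
    logits = x @ W_gate
    gate = np.exp(logits - logits.max())
    gate /= gate.sum()
    # Each expert processes the input; the gate mixes their outputs.
    expert_outs = np.stack([x @ W for W in W_experts])  # (n_experts, d_out)
    return gate @ expert_outs, gate

y, gate = moe_forward(rng.standard_normal(d_in))
print("gate weights:", gate, "output:", y)
```

Because capacity is spread across experts, degrading or attacking one expert shifts routing toward the others, which is the graceful-degradation intuition in the summary.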

A novel system leveraging a mixture of experts significantly improves the robustness of machine learning models against carefully crafted adversarial inputs.

Categories Science

The Deep Learning Scaling Puzzle: Why Bigger Isn’t Always Better

26.12.2025 by qfx

Internal feature learning in deep residual networks collapses with increasing depth, at a rate of [latex] 1/\sqrt{L} [/latex]. This degradation is rectified by a depth-aware learning rate, [latex] \eta_1 = \eta_c n \sqrt{L} [/latex], which restores active learning across layers and enables consistent hyperparameter transfer, yielding lower training and testing losses and higher accuracy across varying network depths and widths.
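A worked example of the depth-aware learning-rate rule quoted above; the base constant [latex] \eta_c [/latex] and width [latex] n [/latex] are assumed values, chosen only to show how the rate grows with depth.

```python
import math

# eta_1 = eta_c * n * sqrt(L): the sqrt(L) factor compensates the 1/sqrt(L)
# collapse of feature learning at depth L.
eta_c = 1e-4   # base learning-rate constant (assumed)
n = 512        # network width (assumed)

for L in (4, 16, 64):
    eta_1 = eta_c * n * math.sqrt(L)
    print(f"depth L={L:3d}: eta_1 = {eta_1:.4f}")
```

Quadrupling the depth doubles the prescribed learning rate, so the same [latex] \eta_c [/latex] transfers across depths instead of being re-tuned.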

New research reveals how the dynamics of feature learning in deep neural networks explain both the successes and limitations of simply scaling up model size.

Categories Science

Giving Graphs a Voice: Enriching Data with Language Models

26.12.2025 by qfx

The system iteratively refines node descriptions within a closed loop: a graph neural network (GNN) provides task feedback, a model-conditioned memory retrieves relevant in-graph exemplars, and a large language model (LLM) uses both to update node semantics before these are fed back into the GNN for continuous improvement.
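The closed loop described above can be sketched structurally as follows. The functions `gnn_score`, `retrieve_exemplars`, and `llm_rewrite` are hypothetical placeholders, stubbed with toy logic so the loop is runnable; the real system would back them with a trained GNN, a model-conditioned memory, and an LLM.

```python
def gnn_score(desc):
    # Stand-in for GNN task feedback: longer descriptions score higher, capped at 1.0.
    return min(len(desc.split()), 10) / 10

def retrieve_exemplars(memory, k=2):
    # Stand-in for memory retrieval: return the k most recent exemplars.
    return memory[-k:]

def llm_rewrite(desc, exemplars):
    # Stand-in for the LLM update: would rewrite desc guided by the exemplars.
    return desc + " refined"

memory = ["node A: a hub airport", "node B: a regional airport"]
desc = "node C: airport"
for step in range(3):
    score = gnn_score(desc)
    if score >= 1.0:
        break  # feedback says the description is good enough
    desc = llm_rewrite(desc, retrieve_exemplars(memory))
    memory.append(desc)  # refined descriptions become future exemplars
print(desc, gnn_score(desc))
```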

A new approach leverages the power of large language models to refine the semantic understanding of nodes within graph structures, leading to improved performance and adaptability.

Categories Science

Decoding Problem Difficulty: A New Approach for Combinatorial Optimization

26.12.2025 by qfx

Combinatorial optimization problems defined on graph structures encompass a diverse range of challenges, fundamentally categorized by constraints on node and edge variables, such as maximizing flow through a network [latex] G = (V, E) [/latex], minimizing the cost of traversing a graph, or satisfying complex relationships between interconnected elements. Solving them requires algorithms to navigate this landscape of possibilities and identify provably optimal solutions.
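As a concrete instance of one problem family mentioned above, maximum flow on [latex] G = (V, E) [/latex], here is a compact Edmonds-Karp implementation on a toy graph; the graph itself is illustrative.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly push flow along shortest augmenting paths."""
    n = len(cap)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow  # no augmenting path left: flow is maximal
        # Find the bottleneck capacity along the path.
        v, bottleneck = t, float("inf")
        while v != s:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        # Push the bottleneck amount, updating residual capacities.
        v = t
        while v != s:
            u = parent[v]
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
            v = u
        flow += bottleneck

# Toy capacity matrix: source 0, sink 3.
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
result = max_flow(cap, 0, 3)
print(result)  # → 5
```

Edge constraints (capacities) are exactly the kind of structural feature the framework above could use to predict how hard a given instance will be.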

Researchers have developed a framework to predict how challenging a graph-based problem will be, offering insights into its inherent complexity.

Categories Science
© 2026 usdaed • Built with GeneratePress