Dissertation Defense Schedule

Academic Excellence

Sharing original dissertation research is a principle to which the University of Delaware is deeply committed. It is the single most important assignment our graduate students undertake, and its completion is met with great pride.

We invite you to celebrate this milestone by attending their dissertation defense. Please review the upcoming dissertation defense schedule below and join us!

Dissertation Defense Form

Must be received two weeks prior to your defense.

Join Us

Celebrate your colleague’s academic success!

It's official

Download the official UD thesis/dissertation manual.

Dissertation Discourse

Need a creative jumpstart?

PROGRAM | Financial Services Analytics

Hypergraph Neural Networks: From Signal Processing to Convolution, U-Nets and Beyond

By: Fuli Wang | Chair: Gonzalo Arce | Co-Chair: Wei Qian

ABSTRACT

Network data has gained significant attention in the signal processing and machine learning communities. Existing research mainly centers on simple graphs, which depict only pairwise connections. This limitation becomes apparent in many real-world scenarios, such as social networks, where interactions often involve multiple individuals, or recommender systems, where decisions are influenced by groups of interrelated purchase histories. In response to this limitation, we turn our focus to a more advanced and general data abstraction: the hypergraph. In a hypergraph, each hyperedge can simultaneously bind multiple nodes, offering a richer representation of relationships. Building on the foundational principles of hypergraph signal processing (HGSP), we develop a series of hypergraph neural networks (HyperGNNs) ranging from convolution to u-nets and beyond, which are applicable to node-level, hyperlink-level, and hypergraph-level tasks.
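
For readers less familiar with the data structure, the short sketch below shows how a small hypergraph might be stored as a node-by-hyperedge incidence matrix; the node names and hyperedges are purely hypothetical and chosen only for illustration.

import numpy as np

# A toy hypergraph: 5 nodes, 3 hyperedges (hypothetical example).
# Unlike a graph edge, a hyperedge may bind any number of nodes at once.
nodes = ["a", "b", "c", "d", "e"]
hyperedges = [("a", "b", "c"),   # a three-way interaction
              ("b", "d"),        # an ordinary pairwise edge is a special case
              ("c", "d", "e")]

# Incidence matrix H: H[i, j] = 1 if node i belongs to hyperedge j.
H = np.zeros((len(nodes), len(hyperedges)))
for j, e in enumerate(hyperedges):
    for v in e:
        H[nodes.index(v), j] = 1.0
print(H)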


The first study embraces recent advances in tensor-HGSP and proposes a tensor-HyperGNN framework, including t-spectral convolution, t-spatial convolution, and t-message-passing layers. The t-spectral convolution is defined under the t-product algebra via the spectral filtering theorem. To improve computational efficiency on large hypergraphs, we localize the t-spectral convolution to formulate the t-spatial convolution and further devise a novel tensor-message-passing algorithm for practical implementation using compressed adjacency tensor representations. The proposed models leverage hypergraph tensor descriptors that preserve complete higher-order relationships without reducing the hypergraph structure, thus allowing for the full exploitation of hypergraph data. Furthermore, we introduce a cross-node interaction tensor to capture polynomial interactions of node features, significantly enhancing the capacity to model intrinsic higher-order relationships beyond traditional linear aggregation approaches.
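
As a simplified point of reference only (not the tensor/t-product formulation developed in this work), a standard matrix-style hypergraph convolution propagates node features through the incidence structure roughly as sketched below; the degree normalization follows the commonly used matrix hypergraph Laplacian, and all names are illustrative.

import numpy as np

def hypergraph_conv(X, H, Theta, w=None):
    # One matrix-style hypergraph convolution (a common baseline formulation,
    # not the t-product tensor layer proposed in the dissertation):
    #   X_out = relu(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta)
    # X: (n_nodes, d_in) features, H: (n_nodes, n_edges) incidence matrix,
    # Theta: (d_in, d_out) learnable weights, w: optional hyperedge weights.
    # Assumes no isolated nodes or empty hyperedges.
    n, m = H.shape
    w = np.ones(m) if w is None else w
    dv = H @ w                    # weighted node degrees
    de = H.sum(axis=0)            # hyperedge degrees (sizes)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    A = Dv_inv_sqrt @ H @ np.diag(w) @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)   # ReLU nonlinearity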


Observing the interplay between HGSP and HyperGNNs, we continue to study the connection between a classic hypergraph signal denoising (HyperGSD) problem and hypergraph convolution layers. Interestingly, we theoretically prove that constructing a hypergraph convolution layer is equivalent to performing one step of gradient descent on the HyperGSD problem. Inspired by this intriguing discovery, we propose a multi-step gradient descent rule and further design a tensor-hypergraph iterative network (T-HGIN) based on the multi-step updating scheme. Compared to the original tensor-hypergraph convolution networks (T-HGCNs), the T-HGIN efficiently propagates information to a larger neighborhood in just one layer, thus achieving better performance with fewer learnable parameters.
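
To make this connection concrete in the simpler matrix setting (the dissertation works with the tensor analogue), a hypergraph signal denoising problem and its gradient update can be sketched as follows, where Z is the denoised signal, X the noisy observation, L a hypergraph Laplacian, and lambda and eta a regularization weight and step size; the notation here is illustrative rather than the dissertation's exact formulation.

\[
\min_{\mathbf{Z}} \; \|\mathbf{Z}-\mathbf{X}\|_F^{2} + \lambda\,\operatorname{tr}\!\left(\mathbf{Z}^{\top}\mathbf{L}\,\mathbf{Z}\right),
\qquad
\mathbf{Z}^{(k+1)} = \mathbf{Z}^{(k)} - \eta\left[\,2\big(\mathbf{Z}^{(k)}-\mathbf{X}\big) + 2\lambda\,\mathbf{L}\,\mathbf{Z}^{(k)}\right].
\]

Starting from Z^(0) = X, a single update reduces to (I - 2*eta*lambda*L) X, i.e., one convolution-like filtering of the input, which is the intuition behind the one-step result; T-HGIN instead unrolls several such updates within a single layer.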


While convolutions have been successfully developed in non-Euclidean higher-order domains, particularly on hypergraphs, the exploration of u-net architectures for hypergraph data remains sparse due to the lack of well-defined pooling and unpooling operations. The last work in this dissertation pioneers the study of u-net architectures for hypergraph data, addressing the critical challenge of designing effective pooling and unpooling operations that retain maximal structural information from the input hypergraph. Motivated by hierarchical clustering, we propose to construct the pooling and unpooling operators all at once by cutting the clustering dendrogram at different granularities, which we coin the Parallel Hierarchical Pooling (PH-Pool) and Unpooling (PH-Unpool) operators. Unlike existing pooling methods that risk local structural damage through a sequential learning procedure, our PH-Pool operators are designed in a global and parallel manner to ensure fidelity to the original hypergraph structure while remaining computationally efficient. The PH-Unpool operators are tailored to perform the inverse operations of the PH-Pools for hypergraph reconstruction, leading to the development of hypergraph u-nets. We validate the proposed hypergraph u-net on hypergraph reconstruction simulations, hypergraph classification, and self-supervised anomaly detection, demonstrating superior performance over state-of-the-art graph and hypergraph deep learning methods.
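
To illustrate the dendrogram-cutting idea in isolation (a simplified sketch under assumed inputs, not the actual PH-Pool/PH-Unpool construction defined in the dissertation), one hierarchical clustering can be cut at several granularities in parallel, each cut yielding a cluster-assignment matrix that can act as one pooling level.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def parallel_pooling_assignments(node_features, levels=(8, 4, 2)):
    # Build the clustering dendrogram once, then cut it at several
    # granularities in parallel. Each cut gives an (n_nodes x n_clusters)
    # assignment matrix S usable as a pooling operator; its transpose (or
    # pseudoinverse) can play the role of the matching unpooling.
    # Illustrative sketch only, not the PH-Pool operator itself.
    Z = linkage(node_features, method="ward")
    assignments = []
    for k in levels:
        labels = fcluster(Z, t=k, criterion="maxclust")   # cut at <= k clusters
        S = np.zeros((node_features.shape[0], labels.max()))
        S[np.arange(len(labels)), labels - 1] = 1.0
        assignments.append(S)
    return assignments

# Example usage with random placeholder features:
# pools = parallel_pooling_assignments(np.random.rand(16, 8))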


The Process

Step-by-Step

Visit our “Step-by-Step Graduation Guide” to walk you through the graduation process, from formatting your dissertation to Doctoral Hooding procedures.

Your First Step >

Dissertation Manual

Wondering how to set up the format for your paper? Refer to the “UD Thesis/Dissertation Manual” for formatting requirements and more.

Download Your Manual >

Defense Submission Form

This form must be completed two weeks in advance of a dissertation defense to meet the University of Delaware Graduate and Professional Education’s requirements.

Submission Form >