- Magnetic topology (by Emmanouil Markoulakis, 19 Sep 2020 15:08)
- Magnetization (by Emmanouil Markoulakis, 18 Sep 2020 03:17)
- Display calculus (by Valentin D. Richard, 16 Jul 2020 21:35)
- Grey-body radiation (by Prof. Carlo Beenakker, 08 Jan 2011 20:36)
- Kaluza-Klein dark matter (by Dr. Geraldine Servant, 05 Dec 2010 22:13)
- Gravitational lensing (by Prof. Koen Kuijken, 05 Dec 2010 22:11)
- Quantum shot noise (by Prof. Carlo Beenakker, 04 Feb 2014 08:52)
- Dzyaloshinskii-Moriya interaction (by Dr. George Jackeli, 28 Aug 2009 09:41)

- We formalize the simulation paradigm of cryptography in terms of category theory and show that protocols secure against abstract attacks form a symmetric monoidal category, thus giving an abstract model of composable security definitions in cryptography. Our model is able to incorporate computational security, set-up assumptions and various attack models such as colluding or independently acting subsets of adversaries in a modular, flexible fashion. We conclude by using string diagrams to rederive no-go results concerning the limits of bipartite and tripartite cryptography, ruling out e.g. composable commitments and broadcasting. On the way, we exhibit two categorical constructions of resource theories that might be of independent interest: one capturing resources shared among n parties and one capturing resource conversions that succeed asymptotically.
- Magnetic field-line reconnection is a universal plasma process responsible for the conversion of magnetic field energy into plasma heating and charged-particle acceleration. Solar flares and Earth's magnetospheric substorms are the two most investigated dynamical systems in which magnetic reconnection is believed to be responsible for global magnetic field reconfiguration and energization of plasma populations. Such a reconfiguration includes the formation of long-lived current systems connecting the primary energy release region and the cold, dense, conductive plasma of the photosphere/ionosphere. In both flares and substorms, the evolution of this current system correlates with the formation and dynamics of energetic particle fluxes. Our study focuses on this similarity between flares and substorms. Using a wide range of datasets available for flare and substorm investigations, we qualitatively compare the dynamics of currents and energetic particle fluxes for one flare and one substorm. We show that there is a clear correlation between energetic particle bursts (associated with energy release due to magnetic reconnection) and magnetic field reconfiguration and the formation of the current system. We then discuss how datasets of in-situ measurements in the magnetospheric substorm can help interpret datasets gathered for the solar flare.
- In this essay we discuss the concepts of \emph{topos} (place), \emph{sunekhes} (the continuum) and the \emph{infinitesimal} as found in classical philosophy (primarily in Aristotle's \emph{Physics}) and their relationship to certain concepts of modern topology. Such an interconnection of the ancient and the modern was at the heart of the thought of Ren\'e Thom, in particular in his \emph{Esquisse d'une S\'emiophysique}. We discuss how the apparently modern (Leibnizian) concept of the infinitesimal has its roots in classical thought, and how the three concepts of topos, continuity and infinitesimal are intimately connected, can be given precise mathematical definitions in modern topology, and are relevant to modern problems in the foundations of mathematics and the philosophy of science.
- We propose a categorical framework for processes which interact bidirectionally with both an environment and a 'controller'. Examples include open learners, in which the controller is an optimiser such as gradient descent, and an approach to compositional game theory closely related to open games, in which the controller is a composite of game-theoretic agents. We believe that 'cybernetic' is an appropriate name for the processes that can be described in this framework.
- Synchrotron radiation from hot gas near a black hole results in a polarized image. The image polarization is determined by effects including the orientation of the magnetic field in the emitting region, relativistic motion of the gas, strong gravitational lensing by the black hole, and parallel transport in the curved spacetime. We explore these effects using a simple model of an axisymmetric, equatorial accretion disk around a Schwarzschild black hole. By using an approximate expression for the null geodesics derived by Beloborodov (2002) and conservation of the Walker-Penrose constant, we provide analytic estimates for the image polarization. We test this model using currently favored general relativistic magnetohydrodynamic simulations of M87*, using ring parameters given by the simulations. For a subset of these with modest Faraday effects, we show that the ring model broadly reproduces the polarimetric image morphology. Our model also predicts the polarization evolution for compact flaring regions, such as those observed from Sgr A* with GRAVITY. With suitably chosen parameters, our simple model can reproduce the EVPA pattern and relative polarized intensity in Event Horizon Telescope images of M87*. Under the physically motivated assumption that the magnetic field trails the fluid velocity, this comparison is consistent with the clockwise rotation inferred from total intensity images.
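The Beloborodov (2002) light-bending approximation mentioned above relates the emission angle alpha at radius r to the angle psi between the radial direction and the line of sight via cos(alpha) = 1 - (1 - cos(psi)) * (1 - r_s/r). A minimal sketch, with the function name and the choice of units (r in Schwarzschild radii) ours:

```python
import math

def beloborodov_alpha(psi, r, r_s=1.0):
    """Approximate emission angle alpha (radians) for a photon leaving
    radius r around a Schwarzschild black hole of Schwarzschild radius
    r_s, reaching a distant observer at angle psi from the radial
    direction, via Beloborodov's (2002) approximation:
        cos(alpha) = 1 - (1 - cos(psi)) * (1 - r_s / r)."""
    cos_alpha = 1.0 - (1.0 - math.cos(psi)) * (1.0 - r_s / r)
    return math.acos(cos_alpha)
```

Far from the hole (r much greater than r_s) this reduces to alpha = psi; closer in, alpha < psi, reflecting the bending of the photon trajectory.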
- Simulating the irradiation of planetary atmospheres by cosmic ray particles requires, among other things, the ability to understand and quantify the interactions of charged particles with planetary magnetic fields. Here we present a process that is very often ignored in such studies: the dispersion and focusing of cosmic ray trajectories in magnetospheres. The calculations were performed using our new code CosmicTransmutation, which has been developed to study cosmogenic nuclide production in meteoroids and planetary atmospheres and which includes the computation of the irradiation spectrum on top of the atmosphere. Here we discuss effects caused by dispersion and focusing of cosmic ray particle trajectories.
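Trajectory tracing of this kind can be sketched with a standard Boris pusher in an Earth-like dipole field. This is not CosmicTransmutation itself; the dipole moment, charge-to-mass ratio, and step size below are illustrative values for a proton in Earth's magnetosphere:

```python
import numpy as np

MU0_4PI = 1e-7                            # mu_0 / 4*pi, SI
M_DIPOLE = np.array([0.0, 0.0, -8.0e22])  # Earth-like dipole moment, A m^2

def dipole_B(r):
    """Magnetic field (tesla) of a point dipole at the origin."""
    rn = np.linalg.norm(r)
    return MU0_4PI * (3.0 * r * (M_DIPOLE @ r) / rn**5 - M_DIPOLE / rn**3)

def boris_step(x, v, q_over_m, dt):
    """One Boris rotation step with no electric field.
    The velocity update is a pure rotation, so |v| is preserved exactly."""
    t = 0.5 * dt * q_over_m * dipole_B(x)
    v_prime = v + np.cross(v, t)
    v_new = v + 2.0 * np.cross(v_prime, t) / (1.0 + t @ t)
    return x + dt * v_new, v_new

# Proton (q/m ~ 9.58e7 C/kg) starting at ~4 Earth radii.
x = np.array([2.56e7, 0.0, 0.0])
v = np.array([0.0, 1.0e7, 5.0e6])
for _ in range(200):
    x, v = boris_step(x, v, 9.58e7, 1e-3)
```

The Boris scheme is the usual workhorse for such tracing because its magnetic-field update conserves particle speed by construction.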
- 3D object segmentation is a fundamental and challenging problem in computer vision with applications in autonomous driving, robotics, augmented reality and medical image analysis. It has received significant attention from the computer vision, graphics and machine learning communities. Traditionally, 3D segmentation was performed with hand-crafted features and engineered methods which failed to achieve acceptable accuracy and could not generalize to large-scale data. Driven by their great success in 2D computer vision, deep learning techniques have recently become the tool of choice for 3D segmentation tasks as well. This has led to an influx of a large number of methods in the literature that have been evaluated on different benchmark datasets. This paper provides a comprehensive survey of recent progress in deep learning based 3D segmentation covering over 150 papers. It summarizes the most commonly used pipelines, discusses their highlights and shortcomings, and analyzes the competitive results of these segmentation methods. Based on the analysis, it also provides promising research directions for the future.
- Context. It is now well-established that small, rocky planets are common around low-mass stars. However, the detection of such planets is challenged by the short-term activity of the host stars. Aims. The HArps-N red Dwarf Exoplanet Survey (HADES) program is a long-term project at the Telescopio Nazionale Galileo aimed at the monitoring of nearby, early-type M dwarfs, using the HARPS-N spectrograph to search for small, rocky planets. Methods. A total of 174 HARPS-N spectroscopic observations of the M0.5V-type star GJ 9689 taken over the past seven years have been analysed. We combined these data with photometric measurements to disentangle signals related to the activity of the host star from possible Keplerian signals in the radial velocity data. We ran an MCMC analysis, applying Gaussian Process regression techniques to model the signals present in the data. Results. We identify two periodic signals in the radial velocity time series, with periods of 18.27 d and 39.31 d. The analysis of the activity indexes, photometric data, and wavelength dependency of the signals reveals that the 39.31 d signal corresponds to the stellar rotation period. On the other hand, the 18.27 d signal shows no relation to any activity proxy or the first harmonic of the rotation period. We therefore identify it as a genuine Keplerian signal. The best-fit model describing the newly found planet, GJ 9689 b, corresponds to a period P$_{\rm b}$ = 18.27 $\pm$ 0.01 d, and a minimum mass M$_{\rm P}\sin i$ = 9.65 $\pm$ 1.41 M$_{\oplus}$.
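Gaussian-process models of stellar activity in radial velocities typically use a quasi-periodic covariance of the kind sketched below. The exact kernel and hyperparameter choices of the HADES analysis may differ; the function and parameter names here are ours:

```python
import numpy as np

def quasi_periodic_kernel(t1, t2, amp, P_rot, lam_e, lam_p):
    """Quasi-periodic covariance commonly used for stellar activity:
        k(t, t') = amp^2 * exp( -(t-t')^2 / (2*lam_e^2)
                                - sin^2(pi*(t-t')/P_rot) / (2*lam_p^2) )
    amp: signal amplitude; P_rot: rotation period; lam_e: evolution
    timescale of active regions; lam_p: smoothness within one period."""
    dt = np.subtract.outer(np.asarray(t1), np.asarray(t2))
    return amp**2 * np.exp(-dt**2 / (2.0 * lam_e**2)
                           - np.sin(np.pi * dt / P_rot)**2 / (2.0 * lam_p**2))
```

The periodic factor lets the model track a rotation-modulated signal (here the 39.31 d component) while the squared-exponential factor allows active regions to evolve between rotations.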
- Numerical models have shown that disc dispersal via internal photoevaporation driven by the host star can successfully reproduce the observed pile-up of warm Jupiters near 1-2 au. However, since a range of different mechanisms have been proposed to cause the same feature, clear observational diagnostics of disc dispersal leaving an imprint in the observed distribution of giant planets could help constrain the dominant mechanisms. We aim to assess the impact of disc dispersal via X-ray driven photoevaporation (XPE) on giant planet separations in order to provide theoretical constraints on the location and size of any possible features related to this process within their observed orbital distribution. For this purpose, we perform a set of 1D population syntheses with varying initial conditions and correlate the gas giants' final parking locations with the X-ray luminosities of their host stars in order to quantify observables of this process within the $a$-$L_\mathrm{x}$-plane of these systems. We find that XPE indeed creates an underdensity of gas giants near the gravitational radius, with corresponding pile-ups inside and/or outside this location. However, the size and location of these features are strongly dependent on the choice of initial conditions in our model, such as the assumed formation location of the planets. XPE can strongly affect the migration process of giant planets and leave potentially observable signatures within the observed orbital separations of giant planets. However, due to the simplistic approach employed in our model, which lacks a self-consistent treatment of planet formation within an evolving disc, a quantitative analysis of the final planet population orbits is not possible. Our results nevertheless strongly motivate future studies to include realistic disc dispersal mechanisms into global planet population synthesis models.
- HD 106906 is a young, binary stellar system, located in the Lower Centaurus Crux (LCC) group. This system is unique among discovered systems in that it contains an asymmetrical debris disk, as well as an 11 M$_{Jup}$ planet companion, at a separation of $\sim$735 AU. Only a handful of other systems are known to contain both a disk and a directly imaged planet, and HD 106906 is the only one in which the planet has apparently been scattered. The debris disk is nearly edge-on and extends roughly to $>$500 AU, and previous studies with HST have shown the outer regions to have high asymmetry. To better understand the structure and composition of the disk, we have performed a deep polarimetric study of HD 106906's asymmetrical debris disk using newly obtained $H$-, $J$-, and $K1$-band polarimetric data from the Gemini Planet Imager (GPI). An empirical analysis of our data supports a disk that is asymmetrical in surface brightness and structure, where fitting an inclined ring model to the disk spine suggests that the disk may be highly eccentric ($e\gtrsim0.16$). A comparison of the disk flux with the stellar flux in each band suggests a blue color that also does not significantly vary across the disk. We discuss these results in terms of possible sources of asymmetry, where we find that dynamical interaction with the planet companion, HD 106906b, is a likely candidate.
- Thermal phase variations of short period planets indicate that they are not spherical cows: day-to-night temperature contrasts range from hundreds to thousands of degrees, rivaling their vertical temperature contrasts. Nonetheless, the emergent spectra of short-period planets have typically been fit using one-dimensional (1D) spectral retrieval codes that only account for vertical temperature gradients. The popularity of 1D spectral retrieval codes is easy to understand: they are robust and have a rich legacy in Solar System atmospheric studies. Exoplanet researchers have recently introduced multi-dimensional retrieval schemes for interpreting the spectra of short-period planets, but these codes are necessarily more complex and computationally expensive than their 1D counterparts. In this paper we present an alternative: phase-dependent spectral observations are inverted to produce longitudinally resolved spectra that can then be fitted using standard 1D spectral retrieval codes. We test this scheme on the iconic phase-resolved spectra of WASP-43b and on simulated JWST observations using the open-source pyratbay 1D spectral retrieval framework. Notably, we take the model complexity of the simulations one step further over previous studies by allowing for longitudinal variations in composition in addition to temperature. We show that performing 1D spectral retrieval on longitudinally resolved spectra is more accurate than applying 1D spectral retrieval codes to disk-integrated emission spectra, despite being identical in terms of computational load. We find that for the extant Hubble and Spitzer observations of WASP-43b the difference between the two approaches is negligible but that JWST phase measurements should be treated with longitudinally \textbf{re}solved \textbf{spect}ral retrieval (ReSpect).
- We present arcminute-resolution intensity and polarization maps of the Galactic center made with the Atacama Cosmology Telescope (ACT). The maps cover a 32 deg$^2$ field at 98, 150, and 224 GHz with $\vert l\vert\le4^\circ$, $\vert b\vert\le2^\circ$. We combine these data with Planck observations at similar frequencies to create coadded maps with increased sensitivity at large angular scales. With the coadded maps, we are able to resolve many known features of the Central Molecular Zone (CMZ) in both total intensity and polarization. We map the orientation of the plane-of-sky component of the Galactic magnetic field inferred from the polarization angle in the CMZ, finding significant changes in morphology in the three frequency bands as the underlying dominant emission mechanism changes from synchrotron to dust emission. Selected Galactic center sources, including Sgr A*, the Brick molecular cloud (G0.253+0.016), the Mouse pulsar wind nebula (G359.23-0.82), and the Tornado supernova remnant candidate (G357.7-0.1), are examined in detail. These data illustrate the potential for leveraging ground-based Cosmic Microwave Background polarization experiments for Galactic science.
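Inferring a plane-of-sky field orientation from Stokes parameters can be sketched as below. Sign and angle conventions for Stokes U differ between the IAU and HEALPix/CMB communities, so this is only an illustration of the standard relations, not the ACT pipeline:

```python
import numpy as np

def b_field_orientation(Q, U):
    """Plane-of-sky magnetic field orientation from Stokes Q, U.
    The polarization (E-vector) position angle is psi = 0.5*arctan2(U, Q);
    for polarized dust emission the field is conventionally taken
    perpendicular to the E-vector, i.e. rotated by 90 degrees."""
    psi = 0.5 * np.arctan2(U, Q)   # polarization angle, radians
    return psi + np.pi / 2.0       # inferred field orientation
```

The half-angle in the arctangent reflects that polarization is a spin-2 quantity: orientations are defined modulo 180 degrees, not 360.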
- Lenses are an important tool in applied category theory. While individual lenses have been widely used in applications, many of the mathematical properties of the corresponding categories of lenses have remained unknown. In this paper, we study the category of small categories and asymmetric delta lenses, and prove that it has several good exactness properties. These properties include the existence of certain limits and colimits, as well as so-called imported limits, such as imported products and imported pullbacks, which have arisen previously in applications. The category is also shown to be extensive, and it has an image factorisation system.
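For readers new to lenses, a minimal state-based lens and the usual well-behavedness laws look like this. Delta lenses, the paper's subject, refine this picture by tracking updates as morphisms rather than mere states; the names below are ours:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Lens:
    get: Callable[[Any], Any]            # view a part of the source
    put: Callable[[Any, Any], Any]       # put(source, new_view) -> new_source

# A very well-behaved lens focusing on the first component of a pair.
fst = Lens(get=lambda s: s[0],
           put=lambda s, v: (v, s[1]))

def check_lens_laws(l, s, v):
    """The three classic lens laws."""
    assert l.put(s, l.get(s)) == s                 # GetPut
    assert l.get(l.put(s, v)) == v                 # PutGet
    assert l.put(l.put(s, v), v) == l.put(s, v)    # PutPut
```

Composing such lenses end to end is what gives rise to the categories whose exactness properties the paper studies.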
- Recent self-supervised representation learning techniques have largely closed the gap between supervised and unsupervised learning on ImageNet classification. While the particulars of pretraining on ImageNet are now relatively well understood, the field still lacks widely accepted best practices for replicating this success on other datasets. As a first step in this direction, we study contrastive self-supervised learning on four diverse large-scale datasets. By looking through the lenses of data quantity, data domain, data quality, and task granularity, we provide new insights into the necessary conditions for successful self-supervised learning. Our key findings include observations such as: (i) the benefit of additional pretraining data beyond 500k images is modest, (ii) adding pretraining images from another domain does not lead to more general representations, (iii) corrupted pretraining images have a disparate impact on supervised and self-supervised pretraining, and (iv) contrastive learning lags far behind supervised learning on fine-grained visual classification tasks.
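The contrastive objective underlying such pretraining can be sketched with a simplified InfoNCE/NT-Xent-style loss that uses only cross-view negatives. This is an illustration with our own names, not the paper's exact implementation:

```python
import numpy as np

def info_nce_loss(z1, z2, tau=0.1):
    """Contrastive (InfoNCE-style) loss on two batches of embeddings,
    where z1[i] and z2[i] are two augmented views of the same image.
    Positives sit on the diagonal of the similarity matrix; every other
    entry in the row acts as a negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                       # cosine similarities / tau
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Minimising this pulls the two views of each image together while pushing apart views of different images, which is the mechanism whose data-quantity and data-domain sensitivity the paper probes.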
- This paper displays the Healy-McInnes UMAP construction $V(X,N)$ as an iterated pushout of Vietoris-Rips objects associated to extended pseudo metric spaces (ep-metric spaces) defined by choices of neighbourhoods of the elements of a finite set $X$. An inclusion $X \subset Y$ in another finite set defines a map of UMAP systems $V(X,N) \to V(Y,N')$ in the presence of a compatible system of neighbourhoods $N'$ for $Y$. There is also an induced map of ep-metric spaces $(X,D) \to (Y,D')$, where $D$ and $D'$ are colimits (global averages) of the metrics defined by the neighbourhood systems for $X$ and $Y$. We prove a stability result for the restriction of this ep-metric space map to global components. This stability result translates, via excision for path components, to a stability result for global components of the UMAP systems.
- The growth of science and technology is a recombinative process, wherein new discoveries and inventions are built from prior knowledge. Yet relatively little is known about the manner in which scientific and technological knowledge develop and coalesce into larger structures that enable or constrain future breakthroughs. Network science has recently emerged as a framework for measuring the structure and dynamics of knowledge. While helpful, existing approaches struggle to capture the global properties of the underlying networks, leading to conflicting observations about the nature of scientific and technological progress. We bridge this methodological gap using tools from algebraic topology to characterize the higher-order structure of knowledge networks in science and technology across scales. We observe rapid growth in the higher-order structure of knowledge in many scientific and technological fields. This growth is not observable using traditional network measures. We further demonstrate that the emergence of higher-order structure coincides with a decline in lower-order structure, and has historically far outpaced the corresponding emergence of higher-order structure in scientific and technological collaboration networks. Up to a point, increases in higher-order structure are associated with better outcomes, as measured by the novelty and impact of papers and patents. However, the nature of science and technology produced under higher-order regimes also appears to be qualitatively different from that produced under lower-order ones, with the former exhibiting greater linguistic abstractness and greater tendencies for building upon prior streams of knowledge.
- We give diagrammatic tools to reason about information flow within encrypted communication. In particular, we are interested in deducing where information flow (communication or otherwise) has taken place, and fully accounting for all possible paths. The core mathematical concept is using a single categorical diagram to model the underlying mathematics, the epistemic knowledge of the participants, and (implicitly) the potential or actual communication between participants. A key part of this is a `correctness' or `consistency' criterion that ensures we accurately and fully account for the distinct routes by which information may come to be known (i.e. communication and/or calculation). We demonstrate how this formalism may be applied to answer questions about communication scenarios where we have only partial information about the participants and their interactions. Similarly, we show how to analyse the consequences of changes to protocols or communications, and to enumerate the distinct orders in which events may have occurred. We use various forms of Diffie-Hellman key exchange as an illustration of these techniques. However, they are entirely general; we illustrate in an appendix how other protocols from non-commutative cryptography may be analysed in the same manner.
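As background, the basic Diffie-Hellman exchange that the paper uses as its running example can be sketched with toy parameters. A real deployment would use a vetted large prime group (e.g. from RFC 3526) or elliptic curves, never a group this small:

```python
import secrets

# Toy Diffie-Hellman key exchange over the classic "textbook" group.
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, a, p)                   # Alice sends A = g^a mod p to Bob
B = pow(g, b, p)                   # Bob sends B = g^b mod p to Alice

shared_alice = pow(B, a, p)        # (g^b)^a mod p
shared_bob = pow(A, b, p)          # (g^a)^b mod p
```

The diagrammatic analysis in the paper tracks exactly which of these values (the public A and B versus the secret exponents a and b) each participant or eavesdropper can come to know, and by which routes.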
- In this paper we introduce and give topological properties of a new concept called simplicial distance, which is the simplicial analog of the homotopic distance (in the sense of Mac\'ias-Virg\'os and Mosquera-Lois [6]). With our definition of simplicial distance, simplicial complexity becomes a particular case of this new concept.
- Mereology is the study of parts and the relationships that hold between them. We introduce a behavioral approach to mereology, in which systems and their parts are known only by the types of behavior they can exhibit. Our discussion is formally topos-theoretic, and agnostic to the topos, providing maximal generality; however, by using only its internal logic we can hide the details and readers may assume a completely elementary set-theoretic discussion. We consider the relationship between various parts of a whole in terms of how behavioral constraints are passed between them, and give an inter-modal logic that generalizes the usual alethic modalities in the setting of symmetric accessibility.
- We show that the natural algebraic structure of the singular chains on a path connected topological space determines the fundamental group functorially. Moreover, we describe a notion of weak equivalence for the relevant algebraic structure under which the data of the fundamental group is preserved.
- Giry algebras are barycenter maps, which are coequalizers of contractible coequalizer pairs (like all monad algebras). Their existence, in general, requires that the measurable space be coseparated by the discrete two-point space, and the hypothesis that no measurable cardinals exist. Under that hypothesis, every coseparated measurable space has an algebra, and the category of Giry algebras provides a convenient setting for probability theory because it is a symmetric monoidal closed category with all limits and colimits, as well as having a separator and coseparator. This is in stark contrast to the Kleisli category of the Giry monad, which is often used to model conditional probability but has a separator and not much else.
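The Kleisli composition in question can be illustrated with the finite (discrete) distribution monad, a toy stand-in for the Giry monad on general measurable spaces. Names below are ours:

```python
from collections import defaultdict

def kleisli_compose(f, g):
    """Compose stochastic maps f: X -> Dist(Y) and g: Y -> Dist(Z) in the
    Kleisli category of the finite distribution monad:
        (g after f)(x)(z) = sum over y of f(x)(y) * g(y)(z).
    Distributions are dicts mapping outcomes to probabilities."""
    def composite(x):
        out = defaultdict(float)
        for y, p in f(x).items():
            for z, q in g(y).items():
                out[z] += p * q
        return dict(out)
    return composite

# A fair coin flip, then a game whose odds depend on the flip.
flip = lambda _: {'H': 0.5, 'T': 0.5}
game = lambda side: {'win': 1.0} if side == 'H' else {'win': 0.2, 'lose': 0.8}
play = kleisli_compose(flip, game)
```

Composing stochastic maps like this is exactly the "conditional probability" reading of the Kleisli category contrasted with the category of algebras in the abstract.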
- We present a set of principles and methodologies which may serve as foundations of a unifying theory of Mathematics. These principles are based on a new view of Grothendieck toposes as unifying spaces able to act as `bridges' for transferring information, ideas and results between distinct mathematical theories.
- This paper unifies different notions of simplification for graphs into a single universal construction on a comma category, given the familiar conditions of faithfulness, regularity, and existence of an adjoint. Specifically, this construction unifies the passage of a directed multigraph to a simple digraph, the passage of a set-system hypergraph to a simple set system, and the passage of an incidence hypergraph to a simple incidence structure. Moreover, this universal construction has a natural dual, a "cosimplification". This dual process unifies the removal of isolated vertices and loose edges for quivers and incidence hypergraphs, as well as the passage of a set-system hypergraph to a set system when using antihomomorphisms.
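A down-to-earth sketch of the first of these passages, directed multigraph to simple digraph, and of its dual, removal of isolated vertices. We use the common convention that "simple" forbids loops and parallel arcs; the paper's precise functors may differ in detail:

```python
def simplify(edges):
    """Collapse a directed multigraph (a list of (u, v) arcs, possibly
    with repeats and loops) to a simple digraph: keep one copy of each
    arc and drop loops."""
    return {(u, v) for (u, v) in edges if u != v}

def cosimplify(vertices, edges):
    """Dual direction: drop isolated vertices, i.e. those incident to
    no arc, keeping the arcs themselves untouched."""
    used = {x for e in edges for x in e}
    return vertices & used, edges
```

The universal construction in the paper characterizes such passages as adjoints, so that both directions arise from one comma-category recipe rather than ad hoc definitions.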
- In this paper, we consider the simplest class of stratified spaces -- linearly embedded graphs. We present an algorithm that learns the abstract structure of an embedded graph and models the specific embedding from a point cloud sampled from it. We use tools and inspiration from computational geometry, algebraic topology, and topological data analysis and prove the correctness of the identified abstract structure under assumptions on the embedding. The algorithm is implemented in the Julia package http://github.com/yossibokor/Skyler.jl, which we used for the numerical simulations in this paper.
#### Categories of Nets (ver. 2)

We present a unified framework for Petri nets and various variants, such as pre-nets and Kock's whole-grain Petri nets. Our framework is based on a less well-studied notion that we call $\Sigma$-nets, which allow finer control over whether tokens are treated using the collective or individual token philosophy. We describe three forms of execution semantics in which pre-nets generate strict monoidal categories, $\Sigma$-nets (including whole-grain Petri nets) generate symmetric strict monoidal categories, and Petri nets generate commutative monoidal categories, all by left adjoint functors. We also construct adjunctions relating these categories of nets to each other, in particular showing that all kinds of net can be embedded in the unifying category of $\Sigma$-nets, in a way that commutes coherently with their execution semantics.

- One way of interpreting a left Kan extension is as taking a kind of "partial colimit", whereby one replaces parts of a diagram by their colimits. We make this intuition precise by means of the "partial evaluations" sitting in the so-called bar construction of monads. The (pseudo)monads of interest for forming colimits are the monad of diagrams and the monad of small presheaves, both on the (huge) category CAT of locally small categories. Throughout, particular care is taken to handle size issues, which are notoriously delicate in the context of free cocompletion. We spell out, with all 2-dimensional details, the structure maps of these pseudomonads. Then, based on a detailed general proof of how the "restriction-of-scalars" construction of monads extends to the case of pseudoalgebras over pseudomonads, we define a morphism of monads between them, which we call "image". This morphism allows us in particular to generalize the idea of "confinal functors", i.e. of functors which leave colimits invariant in an absolute way. This generalization includes the concept of absolute colimit as a special case. The main result of this paper spells out how a pointwise left Kan extension of a diagram corresponds precisely to a partial evaluation of its colimit. This categorical result is analogous to what happens in the case of probability monads, where a conditional expectation of a random variable corresponds to a partial evaluation of its center of mass.
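The notion of a partial evaluation has a very concrete toy instance in the free commutative monoid monad: a formal sum is partially evaluated by summing chosen sub-expressions without collapsing the whole thing, analogous to replacing parts of a diagram by their colimits. Names below are ours:

```python
def partial_evaluate(xs, groups):
    """A 'partial evaluation' in the free commutative monoid monad:
    replace the sub-expressions of a formal sum indexed by each group
    with their sums. groups is a list of index lists partitioning
    range(len(xs)); the total sum is invariant under any such step."""
    return [sum(xs[i] for i in g) for g in groups]

expr = [1, 2, 3, 4]
partial = partial_evaluate(expr, [[0, 1], [2, 3]])   # partially evaluated
total = partial_evaluate(expr, [[0, 1, 2, 3]])       # fully evaluated
```

Here `partial` is the analogue of a pointwise left Kan extension of the "diagram" `expr`, and `total` its colimit; the invariance of the sum mirrors the conditional-expectation analogy at the end of the abstract.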
- We present a novel method of associating Euclidean features to simplicial complexes, providing a way to use them as input to statistical and machine learning tools. This method extends the node2vec algorithm to simplices of higher dimensions, providing insight into the structure of a simplicial complex, or into the higher-order interactions in a graph.
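The node2vec ingredient being generalized can be sketched as plain uniform random walks on a graph, whose walk "sentences" are then fed to a word2vec-style embedder; the paper runs analogous walks over higher-dimensional simplices instead of vertices. Names below are ours:

```python
import random

def random_walks(adj, walk_len=5, walks_per_node=2, seed=0):
    """Generate uniform random walks on a graph given as an adjacency
    dict {node: [neighbours]}. Each walk is a list of node ids; the
    collection of walks plays the role of a text corpus for a
    word2vec-style embedding of the nodes."""
    rng = random.Random(seed)
    walks = []
    for v in adj:
        for _ in range(walks_per_node):
            walk = [v]
            for _ in range(walk_len - 1):
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks
```

In the simplicial extension, "neighbours" of a simplex are defined via shared faces or cofaces, so the same walk-then-embed pipeline yields Euclidean features for higher-order interactions.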
- Inverse categories are categories in which every morphism x has a unique pseudo-inverse y in the sense that xyx=x and yxy=y. Persistence modules from topological data analysis and similarly decomposable category representations factor through inverse categories. This paper gives a numerical condition, decidable when the indexing category is finite, characterizing when a representation of a small category factors through an inverse category.
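The pseudo-inverse identities can be checked numerically with the Moore-Penrose pseudoinverse, which satisfies xyx=x and yxy=y in the category of real matrices. Note that matrices do not themselves form an inverse category: a y with these two properties is not unique there, and uniqueness is exactly what the two extra Penrose symmetry conditions (which `pinv` also satisfies) restore:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])

A_pinv = np.linalg.pinv(A)   # Moore-Penrose pseudoinverse of A

# The defining pseudo-inverse identities x y x = x and y x y = y:
assert np.allclose(A @ A_pinv @ A, A)
assert np.allclose(A_pinv @ A @ A_pinv, A_pinv)
```

This gives a hands-on feel for the structure that persistence modules are required to factor through.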
- We develop the theory of derived differential geometry in terms of bundles of curved $L_\infty[1]$-algebras, i.e. dg manifolds of positive amplitudes. We prove that the category of derived manifolds is a category of fibrant objects. Therefore, we can make sense of "homotopy fibered product" and "derived intersection" of submanifolds in a smooth manifold in the homotopy category of derived manifolds. We construct a factorization of the diagonal using path spaces. First we construct an infinite-dimensional factorization using actual path spaces motivated by the AKSZ construction, then we cut down to finite dimensions using the Fiorenza-Manetti method. The main ingredient is the homotopy transfer theorem for curved $L_\infty[1]$-algebras. We also prove the inverse function theorem for derived manifolds, and investigate the relationship between weak equivalence and quasi-isomorphism for derived manifolds.
- We show that hypernetworks can be regarded as posets which, in their turn, have a natural interpretation as simplicial complexes and, as such, are endowed with an intrinsic notion of curvature, namely the Forman Ricci curvature, that strongly correlates with the Euler characteristic of the simplicial complex. This approach, inspired by the work of E. Bloch, allows us to canonically associate a simplicial complex structure to a hypernetwork, directed or undirected. In particular, this greatly simplifies the geometric Persistent Homology method we previously proposed.
- We develop a general noncommutative version of Balmer's tensor triangular geometry that is applicable to arbitrary monoidal triangulated categories (M$\Delta$Cs). Insight from noncommutative ring theory is used to obtain a framework for prime, semiprime, and completely prime (thick) ideals of an M$\Delta$C, ${\bf K}$, and then to associate to ${\bf K}$ a topological space--the Balmer spectrum $\operatorname{Spc} {\bf K}$. We develop a general framework for (noncommutative) support data, coming in three different flavors, and show that $\operatorname{Spc} {\bf K}$ is a universal terminal object for the first two notions (support and weak support). The first two types of support data are then used in a theorem that gives a method for the explicit classification of the thick (two-sided) ideals and the Balmer spectrum of an M$\Delta$C. The third type (quasi support) is used in another theorem that provides a method for the explicit classification of the thick right ideals of ${\bf K}$, which in turn can be applied to classify the thick two-sided ideals and $\operatorname{Spc} {\bf K}$. As a special case, our approach can be applied to the stable module categories of arbitrary finite dimensional Hopf algebras that are not necessarily cocommutative (or quasitriangular). We illustrate the general theorems with classifications of the Balmer spectra and thick two-sided/right ideals for the stable module categories of all small quantum groups for Borel subalgebras, and classifications of the Balmer spectra and thick two-sided ideals of Hopf algebras studied by Benson and Witherspoon.
- In this thesis, we present a flexible framework for specifying and constructing operads which are suited to reasoning about network construction. The data used to present these operads is called a \emph{network model}, a monoidal variant of Joyal's combinatorial species. The construction of the operad required that we develop a monoidal lift of the Grothendieck construction. We then demonstrate how concepts like priority and dependency can be represented in this framework. For the former, we generalize Green's graph products of groups to the context of universal algebra. For the latter, we examine the emergence of monoidal fibrations from the presence of catalysts in Petri nets.
- COVID-19 has caused thousands of deaths around the world and also resulted in a large international economic disruption. Identifying the pathways associated with this illness can help medical researchers to better understand the properties of the condition. This process can be carried out by analyzing the medical records. It is crucial to develop tools and models that can aid researchers with this process in a timely manner. However, medical records are often unstructured clinical notes, and this poses significant challenges to developing automated systems. In this article, we propose a pipeline to aid practitioners in analyzing clinical notes and revealing the pathways associated with this disease. Our pipeline relies on topological properties and consists of three steps: 1) pre-processing the clinical notes to extract the salient concepts, 2) constructing a feature space of the patients to characterize the extracted concepts, and finally, 3) leveraging the topological properties to distill the available knowledge and visualize the result. Our experiments on a publicly available dataset of COVID-19 clinical notes demonstrate that our pipeline can indeed extract meaningful pathways.
#### Metric monads (ver. 3)

We develop universal algebra over an enriched category $\mathcal K$ and relate it to finitary enriched monads over $\mathcal K$. Using it, we deduce recent results about ordered universal algebra, where inequations are used instead of equations. We then apply it to metric universal algebra, where quantitative equations are used instead of equations. This contributes to the understanding of finitary monads on the category of metric spaces.
- Graphical languages are symmetric monoidal categories presented by generators and equations. The string-diagram notation allows one to transform numerous axioms into low-dimensional topological rules that we, as inhabitants of three-dimensional space, are comfortable with. This aspect is often referred to as the Only Topology Matters (OTM) paradigm. However, OTM remains quite informal, and its exact meaning in terms of rewriting rules is ambiguous. In this paper we define three precise aspects of the OTM paradigm, namely flexsymmetry, flexcyclicity and flexibility of Frobenius algebras. We investigate how this new framework can simplify the presentation of known graphical languages based on Frobenius algebras.
- Conditional distributions, as defined by the Markov category framework, are studied in the setting of matrix algebras (quantum systems). Their construction as linear unital maps is obtained via a categorical Bayesian inversion procedure. Simple criteria establishing when such linear maps are positive are obtained. Several examples are provided, including the standard EPR scenario, where the EPR correlations are reproduced in a purely compositional (categorical) manner. A comparison between the Bayes map and the Petz recovery map is provided, illustrating some key differences.
- Using convex Grothendieck fibrations, we characterize the von Neumann entropy as a functor from finite-dimensional non-commutative probability spaces and state-preserving *-homomorphisms to real numbers. Our axioms reproduce those of Baez, Fritz, and Leinster characterizing the Shannon entropy difference. The existence of disintegrations for classical probability spaces plays a crucial role in our characterization.
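As a numerical illustration of the functional being characterized (not of the categorical axioms themselves), the von Neumann entropy of a density matrix is computed from its eigenvalues; a minimal sketch:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues;
    zero eigenvalues contribute 0 by the convention 0 log 0 = 0."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: entropy 0
mixed = np.eye(2) / 2                        # maximally mixed qubit: log 2
assert abs(von_neumann_entropy(pure)) < 1e-9
assert abs(von_neumann_entropy(mixed) - np.log(2)) < 1e-9
```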
- We introduce quantum Markov categories as a structure that refines and extends a synthetic approach to probability theory and information theory so that it includes quantum probability and quantum information theory. In this broader context, we analyze three successively more general notions of reversibility and statistical inference: ordinary inverses, disintegrations, and Bayesian inverses. We prove that each is a strictly special instance of the next for certain subcategories, providing a categorical foundation for Bayesian inversion as a generalization of reversing a process. We unify the categorical and $C^*$-algebraic notions of almost everywhere (a.e.) equivalence. As a consequence, we prove many results including a universal no-broadcasting theorem for S-positive categories, a generalized Fisher--Neyman factorization theorem for a.e. modular categories, a relationship between error correcting codes and disintegrations, and the relationship between Bayesian inversion and Umegaki's non-commutative sufficiency.
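In the classical (commutative) subcategory, Bayesian inversion reduces to ordinary Bayes' rule for stochastic matrices; a minimal sketch, with `bayes_inverse` a hypothetical name:

```python
import numpy as np

def bayes_inverse(f, p):
    """Given a column-stochastic channel f (f[y, x] = P(y|x)) and a
    prior p over inputs, return the Bayesian inverse g with
    g[x, y] = P(x|y). This is the classical instance of categorical
    Bayesian inversion: g is defined up to a.e. equivalence wherever
    the pushforward q assigns zero probability."""
    q = f @ p                      # pushforward distribution over outputs
    joint = f * p[np.newaxis, :]   # joint[y, x] = P(y|x) P(x)
    with np.errstate(divide="ignore", invalid="ignore"):
        g = np.where(q[np.newaxis, :] > 0, joint.T / q[np.newaxis, :], 0.0)
    return g
```

A defining property of the Bayesian inverse is that pushing the output distribution q back through g recovers the prior: `g @ (f @ p) == p`.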
- The approximate graph colouring problem concerns colouring a $k$-colourable graph with $c$ colours, where $c\geq k$. This problem naturally generalises to promise graph homomorphism and further to promise constraint satisfaction problems. Complexity analysis of all these problems is notoriously difficult. In this paper, we introduce two new techniques to analyse the complexity of promise CSPs: one is based on topology and the other on adjunction. We apply these techniques, together with the previously introduced algebraic approach, to obtain new NP-hardness results for a significant class of approximate graph colouring and promise graph homomorphism problems.
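The objects involved can be made concrete: a c-colouring of G is exactly a graph homomorphism G → K_c. A brute-force check (feasible only for tiny instances; the point of the paper is that the promise version is hard in general):

```python
from itertools import product

def has_homomorphism(edges, n, c):
    """True iff the graph on vertices 0..n-1 with the given edge list
    admits a homomorphism to the complete graph K_c, i.e. a proper
    c-colouring. Exhaustive search over all c**n assignments."""
    for colouring in product(range(c), repeat=n):
        if all(colouring[u] != colouring[v] for u, v in edges):
            return True
    return False

# A triangle is 3-colourable but not 2-colourable.
triangle = [(0, 1), (1, 2), (0, 2)]
assert has_homomorphism(triangle, 3, 3)
assert not has_homomorphism(triangle, 3, 2)
```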
- Contextuality is a key feature of quantum mechanics. We present the sheaf-theoretic approach to contextuality introduced by Abramsky and Brandenburger, and show how it covers a range of logical and physical phenomena "at the borders of paradox".
- Classical and exceptional Lie algebras and their representations are among the most important tools in the analysis of symmetry in physical systems. In this letter we show how the computation of tensor products and branching rules of irreducible representations is machine-learnable, and can achieve relative speed-ups of orders of magnitude in comparison to non-ML algorithms.
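For SU(2) the non-ML baseline is classical Clebsch-Gordan decomposition, which is simple enough to state in one line; this sketch is only the exact reference computation, not the paper's ML models:

```python
def su2_tensor_decomposition(d1, d2):
    """Decompose the tensor product of SU(2) irreps of dimensions d1, d2
    (spins (d1-1)/2 and (d2-1)/2) into irreducible dimensions:
    d1 (x) d2 = (|d1-d2|+1) (+) (|d1-d2|+3) (+) ... (+) (d1+d2-1)."""
    return list(range(abs(d1 - d2) + 1, d1 + d2, 2))

# spin-1/2 (x) spin-1/2 = singlet (+) triplet
assert su2_tensor_decomposition(2, 2) == [1, 3]
# dimensions on both sides must agree
assert sum(su2_tensor_decomposition(3, 4)) == 12
```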
- We introduce fusion bialgebras and their duals and systematically study their Fourier analysis. As an application, we discover new efficient analytic obstructions on the unitary categorification of fusion rings. We prove the Hausdorff-Young inequality and uncertainty principles for fusion bialgebras and their duals. We show that the Schur product property, Young's inequality and the sum-set estimate hold for fusion bialgebras, but not always on their duals. If the fusion ring is the Grothendieck ring of a unitary fusion category, then these inequalities hold on the duals. Therefore, these inequalities are analytic obstructions of categorification. We classify simple integral fusion rings of Frobenius type up to rank 8 and of Frobenius-Perron dimension less than 4080. We find 34 such rings, 4 of which are group-like and 28 of which can be eliminated by applying the Schur product property on the dual. In general, these inequalities are obstructions to subfactorizing fusion bialgebras.
- Higher-order probabilistic programming languages allow programmers to write sophisticated models in machine learning and statistics in a succinct and structured way, but step outside the standard measure-theoretic formalization of probability theory. Programs may use both higher-order functions and continuous distributions, or even define a probability distribution on functions. But standard probability theory does not handle higher-order functions well: the category of measurable spaces is not cartesian closed. Here we introduce quasi-Borel spaces. We show that these spaces: form a new formalization of probability theory replacing measurable spaces; form a cartesian closed category and so support higher-order functions; form a well-pointed category and so support good proof principles for equational reasoning; and support continuous probability distributions. We demonstrate the use of quasi-Borel spaces for higher-order functions and probability by showing that a well-known construction of probability theory involving random functions gains a cleaner expression, and by generalizing de Finetti's theorem, a crucial theorem in probability theory, to quasi-Borel spaces.
- Morphisms in a monoidal category are usually interpreted as processes, and graphically depicted as square boxes. In practice, we are faced with the problem of interpreting what non-square boxes ought to represent in terms of the monoidal category and, more importantly, how they should be composed. Examples of this situation include lenses or learners. We propose a description of these non-square boxes, which we call open diagrams, using the monoidal bicategory of profunctors. A graphical coend calculus can then be used to reason about open diagrams and their compositions.
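Lenses are the simplest "non-square box": a forward map get and a backward map put, composing in sequence. A minimal sketch in Python (the class and operator choice are illustrative, not from the paper):

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Lens:
    """A lens (get, put): forward input s, forward output a,
    backward input b, backward output an updated s."""
    get: Callable[[Any], Any]
    put: Callable[[Any, Any], Any]

    def __rshift__(self, other):
        """Sequential composition l1 >> l2: the backward pass of l2
        runs on the intermediate value produced by l1's forward pass."""
        return Lens(
            get=lambda s: other.get(self.get(s)),
            put=lambda s, b: self.put(s, other.put(self.get(s), b)),
        )

# A lens focusing on the first component of a pair.
first = Lens(get=lambda s: s[0], put=lambda s, a: (a, s[1]))
twice = first >> first  # focus on the first component of the first component

assert twice.get(((1, 2), 3)) == 1
assert twice.put(((1, 2), 3), 9) == ((9, 2), 3)
```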
- Despite ample evidence that our concepts, our cognitive architecture, and mathematics itself are all deeply compositional, few models take advantage of this structure. We therefore propose a radically compositional approach to computational neuroscience, drawing on the methods of applied category theory. We describe how these tools grant us a means to overcome complexity and improve interpretability, and supply a rigorous common language for scientific modelling, analogous to the type theories of computer science. As a case study, we sketch how to translate from compositional narrative concepts to neural circuits and back again.
- The categorical compositional distributional (DisCoCat) model of meaning developed by Coecke et al. (2010) has been successful in modeling various aspects of meaning. However, it fails to model the fact that language can change. We give an approach to DisCoCat that allows us to represent language models and translations between them, enabling us to describe translations from one language to another, or changes within the same language. We unify the product space representation given in (Coecke et al., 2010) and the functorial description in (Kartsaklis et al., 2013), in a way that allows us to view a language as a catalogue of meanings. We formalize the notion of a lexicon in DisCoCat, and define a dictionary of meanings between two lexicons. All this is done within the framework of monoidal categories. We give examples of how to apply our methods, and give a concrete suggestion for compositional translation in corpora.
- This book brings new mathematical rigour to the ongoing vigorous debate on how to quantify biological diversity. The question "what is diversity?" has surprising mathematical depth, and breadth too: this book involves parts of mathematics ranging from information theory, functional equations and probability theory to category theory, geometric measure theory and number theory. It applies the power of the axiomatic method to a biological problem of pressing concern, but the new concepts and theorems are also motivated from a purely mathematical perspective. The main narrative thread requires no more than an undergraduate course in analysis. No familiarity with entropy or diversity is assumed.
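A central family of diversity measures in this area are the Hill numbers, the exponentials of the Rényi entropies; a minimal sketch of their computation (the function name is hypothetical):

```python
import math

def hill_number(p, q):
    """Diversity of order q of a relative-abundance distribution p:
    D_q = (sum_i p_i^q)^(1/(1-q)), with the q -> 1 limit taken as
    exp(Shannon entropy). q = 0 gives species richness."""
    p = [x for x in p if x > 0]
    if abs(q - 1) < 1e-12:
        return math.exp(-sum(x * math.log(x) for x in p))
    return sum(x ** q for x in p) ** (1 / (1 - q))

# A community of n equally abundant species has diversity n at every order.
uniform = [0.25] * 4
for q in (0, 1, 2):
    assert abs(hill_number(uniform, q) - 4) < 1e-9
```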
- We go back to the roots of enriched category theory and study categories enriched in chain complexes; that is, we deal with differential graded categories (DG-categories for short). In particular, we recall weighted colimits and provide examples. We solve the 50-year-old question of how to characterize Cauchy complete DG-categories in terms of the existence of certain specific finite absolute colimits. Alongside the interactions between absolute weighted colimits, we also examine the total complex of a chain complex in a DG-category as a non-absolute weighted colimit.
- This thesis takes inspiration from quantum physics to investigate mathematical structure that lies at the interface of algebra and statistics. The starting point is a passage from classical probability theory to quantum probability theory. The quantum version of a probability distribution is a density operator, the quantum version of marginalizing is an operation called the partial trace, and the quantum version of a marginal probability distribution is a reduced density operator. Every joint probability distribution on a finite set can be modeled as a rank one density operator. By applying the partial trace, we obtain reduced density operators whose diagonals recover classical marginal probabilities. In general, these reduced densities will have rank higher than one, and their eigenvalues and eigenvectors will contain extra information that encodes subsystem interactions governed by statistics. We decode this information, show it is akin to conditional probability, and then investigate the extent to which the eigenvectors capture "concepts" inherent in the original joint distribution. The theory is then illustrated with an experiment that exploits these ideas. Turning to a more theoretical application, we also discuss a preliminary framework for modeling entailment and concept hierarchy in natural language, namely, by representing expressions in the language as densities. Finally, initial inspiration for this thesis comes from formal concept analysis, which finds many striking parallels with linear algebra. The parallels are not coincidental, and a common blueprint is found in category theory.
We close with an exposition on free (co)completions and how the free-forgetful adjunctions in which they arise strongly suggest that in certain categorical contexts, the "fixed points" of a morphism with its adjoint encode interesting information.
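The passage joint distribution → rank-one density → reduced density can be checked directly in NumPy; a minimal sketch assuming a 2 x 2 joint distribution:

```python
import numpy as np

# A joint probability distribution on a 2 x 2 set.
joint = np.array([[0.4, 0.1],
                  [0.2, 0.3]])

# Rank-one density operator |psi><psi|, with psi the entrywise square
# root of the joint, viewed as a unit vector in the tensor product.
psi = np.sqrt(joint).reshape(-1)        # shape (4,)
rho = np.outer(psi, psi)                # 4 x 4, rank one

# Partial trace over the second subsystem yields the reduced density;
# its diagonal recovers the classical marginal distribution.
rho4 = rho.reshape(2, 2, 2, 2)          # indices (i, j, i', j')
reduced = np.einsum("ijkj->ik", rho4)   # trace out the second factor

assert np.allclose(np.diag(reduced), joint.sum(axis=1))
# A correlated joint gives a reduced density of rank higher than one.
assert np.linalg.matrix_rank(reduced) == 2
```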
- The famous biologist Robert Rosen argued for an intrinsic difference between biological and artificial life, supporting the claim that `living systems are not mechanisms'. This result, understood as the claim that life-like mechanisms are non-computable, can be phrased as the non-existence of an equivalence between a category of `static'/analytic elements and a category of `variable'/synthetic elements. The property of a system of being synthetic, understood as being the gluing of `variable families' of analytica, must imply that the latter class of objects does not retain sufficient information to describe said variability; we contribute to this thesis with an argument rooted in elementary category theory. Seen as such, Rosen's `proof' that no living system can be a mechanism arises from a tension between two contrapuntal needs: on one side, the necessity to consider (synthetically) variable families of systems; on the other, the necessity to describe a syntheticum via a universally chosen analyticum.