Given a manifold $M$, a divergence function $\mathcal{D}$ is a non-negative function on the product manifold $M \times M$ that achieves its global minimum of zero (with positive semi-definite Hessian) at those points that form its diagonal submanifold $\Delta_M$. It is well known that the statistical structure on $M$ (a Riemannian metric with a pair of conjugate affine connections) can be constructed from the second and third derivatives of $\mathcal{D}$ evaluated on $\Delta_M$. In Zhang (2004) and subsequent work, a framework based on convex analysis was proposed to unify familiar families of divergence functions. The resulting geometry, which displays what is called ``reference-representation biduality'', completely captures the alpha-structure (i.e., the statistical structure with a parametric family of conjugate connections) of classical information geometry. This is the alpha-Hessian geometry with equi-affine structure. Here, we continue this investigation on two parallel fronts, namely, how $\mathcal{D}$ on $M \times M$ a) is related to various Minkowski metrics on $M$; and b) generates a symplectic structure on $M \times M$. On point a), a set of inequalities is developed that uniformly bounds $\mathcal{D}$ by Minkowski distances on $M$. On point b), convex-induced divergence functions will be shown to generate a K\"ahler structure under which the statistical structure of $M$ can be modeled.
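The well-known construction alluded to above is due to Eguchi; a sketch in local coordinates, with notation ($\mathcal{D}$ for the divergence, coordinates $x$, $y$ on the two factors of $M \times M$) chosen here for illustration:

```latex
% Eguchi's construction: Riemannian metric and conjugate connections
% induced by a divergence function D, evaluated on the diagonal y = x.
g_{ij}(x)
  = -\left.\frac{\partial^{2}}{\partial x^{i}\,\partial y^{j}}
      \mathcal{D}(x,y)\right|_{y=x},
\qquad
\Gamma_{ij,k}(x)
  = -\left.\frac{\partial^{3}}{\partial x^{i}\,\partial x^{j}\,\partial y^{k}}
      \mathcal{D}(x,y)\right|_{y=x},
\qquad
\Gamma^{*}_{ij,k}(x)
  = -\left.\frac{\partial^{3}}{\partial y^{i}\,\partial y^{j}\,\partial x^{k}}
      \mathcal{D}(x,y)\right|_{y=x}.
```

The pair $(\Gamma, \Gamma^{*})$ is conjugate with respect to $g$, which is what makes $(g, \Gamma, \Gamma^{*})$ a statistical structure on $M$.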