Boltzmann-Shannon entropy leads to the exponential model as the maximum entropy model under the constraint to the space of pdfs for which the expectation of a given statistic $t(x)$ equals a common vector. The maximum likelihood estimator for the expectation parameter of $t(x)$ under the exponential model is characterized by specific properties such as the attainment of the Cramér-Rao bound. Any generator function $U$ defines a $U$-entropy and a $U$-divergence under the assumption of convexity of $U$. In this framework, the $U$-entropy leads to the $U$-model as the maximum entropy model, under which the minimum $U$-divergence estimator for the expectation parameter is characterized by a structure of orthogonal foliation. If $U(t) = \exp(t)$, then this reduces to the case of Boltzmann-Shannon entropy. Surprisingly, we observe that the minimum $U$-divergence estimator under the $U$-model has a unique form, namely the sample mean of $t(x)$. Alternatively, if the minimum $U$-divergence estimator is employed under another maximum entropy model, say a $V$-model built from a different generator $V$, then the estimator takes a different form, a weighted mean of $t(x)$ over the sample. This talk discusses information-geometric understandings of this aspect in terms of the Pythagorean identity, a minimax game, and robustness.
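
For concreteness, the following display is a minimal sketch of how such generator-based quantities are commonly set up, assuming the usual convention $u = U'$ and $\xi = u^{-1}$; the particular symbols and normalizations are illustrative assumptions, not taken verbatim from the talk. For a convex generator $U$ and probability densities $f$ and $g$ (so $\int f\,dx = \int g\,dx = 1$), the $U$-cross entropy and $U$-divergence may be written as
\[
  C_U(f, g) = \int \bigl\{ U(\xi(g(x))) - f(x)\,\xi(g(x)) \bigr\}\, dx,
  \qquad
  H_U(f) = C_U(f, f),
\]
\[
  D_U(f, g) = C_U(f, g) - H_U(f)
  = \int \bigl\{ U(\xi(g)) - U(\xi(f)) - f\,(\xi(g) - \xi(f)) \bigr\}\, dx \;\ge\; 0,
\]
where nonnegativity follows from the convexity of $U$. Taking $U(t) = \exp(t)$, so that $\xi(f) = \log f$, gives $U(\xi(g)) = g$ and $U(\xi(f)) = f$, and hence
\[
  D_{\exp}(f, g) = \int \bigl\{ g - f - f \log (g/f) \bigr\}\, dx
  = \int f \log \frac{f}{g}\, dx,
\]
the Kullback-Leibler divergence, matching the stated reduction to the Boltzmann-Shannon case. In this notation the "unique form" of the matched estimator mentioned above is simply
\[
  \hat{\mu}_U = \frac{1}{n} \sum_{i=1}^{n} t(x_i),
\]
the sample mean of $t(x)$, whereas the mismatched case replaces this by a weighted mean with data-dependent weights.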