Ab-initio Study of Electronic and Lattice Dynamical Properties of monolayer ZnO under Strain (2308.00414v1)
Abstract: First-principles density functional theory based calculations have been performed to investigate the strain-induced modifications in the electronic and vibrational properties of monolayer (ML) ZnO. A wide range of in-plane tensile and compressive strains along different directions is applied to analyse the modifications in detail. The electronic band gap reduces under both tensile and compressive strains, and a direct-to-indirect band gap transition occurs at large biaxial tensile strains. The comparatively slow decrease of the band gap and the large strain required for the direct-to-indirect transition, relative to other $2$D materials, are analysed. A systematic decrease in the frequency of the in-plane optical phonon modes and an increase in that of the out-of-plane optical modes are observed with increasing tensile strain. The in-plane acoustic modes show linear dispersion in both the unstrained and strained cases. However, the out-of-plane acoustic (ZA) mode, which shows quadratic dispersion in the unstrained condition, becomes linear under strain. The dispersion of the ZA mode is analysed using shell elasticity theory, and the possibility of ripple formation under strain is examined. The strain-induced linearity of the ZA mode indicates the absence of rippling under strain. Finally, the stability limit of ML-ZnO is investigated, and it is found that at $18\%$ biaxial tensile strain the structure becomes unstable, as signalled by the emergence of imaginary phonon modes. Furthermore, the potential of ML-ZnO as a good thermoelectric material is assessed in an intuitive way on the basis of the calculated electronic and phononic properties. Our results thus not only highlight the significance of strain engineering in tailoring the electronic and vibrational properties but also provide a thorough understanding of the lattice dynamics and mechanical strength of ML-ZnO.
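As a minimal sketch of the shell-elasticity argument invoked above (the symbols are not defined in the abstract and are assumed here: $\kappa$ for the bending rigidity, $\sigma$ for the strain-induced in-plane tension, and $\rho_{2D}$ for the areal mass density), the long-wavelength ZA dispersion of a tensioned membrane takes the form

$$\omega_{\mathrm{ZA}}(q) \;=\; \sqrt{\frac{\kappa\, q^{4} + \sigma\, q^{2}}{\rho_{2D}}},$$

so that $\omega_{\mathrm{ZA}} \propto q^{2}$ in the unstrained limit ($\sigma = 0$), while a finite tensile strain makes the $\sigma q^{2}$ term dominate at small $q$ and gives the linear dispersion $\omega_{\mathrm{ZA}} \approx \sqrt{\sigma/\rho_{2D}}\, q$, consistent with the strain-induced linearity of the ZA mode and the suppression of rippling described in the abstract.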