Recent advances in artificial intelligence have accelerated research in AI-driven materials discovery. However, developing generalized predictive models remains a major challenge in this domain because property data are scarce and existing datasets are heavily biased toward specific chemical spaces. To address these issues, we pretrain models on large-scale molecular structure data and apply cross-task learning techniques that enable effective information sharing across property prediction tasks. Our approach facilitates the learning of broadly applicable representations, improving generalization to unseen chemical domains. The resulting model, EXAONE Discovery, serves as a foundation model for materials science, capable of supporting diverse materials development tasks. By combining large-scale pretraining with cross-task learning strategies, EXAONE Discovery shows strong potential for accelerating innovation across multiple materials-related fields.
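
To make the cross-task learning idea concrete, the sketch below shows one common way such information sharing can be set up: a single encoder (standing in for a model pretrained on large-scale molecular structure data) feeds lightweight per-property heads, so every property dataset contributes gradients to the shared representation. This is a minimal illustrative sketch in PyTorch, not the actual EXAONE Discovery architecture; all class names, task names, and dimensions are hypothetical.

```python
import torch
import torch.nn as nn

class MultiTaskPropertyModel(nn.Module):
    """Shared encoder with one regression head per property task.

    The encoder stands in for a network pretrained on large-scale
    molecular structure data; the per-task heads let gradients from
    every property dataset update the shared representation.
    """

    def __init__(self, input_dim: int, hidden_dim: int, task_names: list[str]):
        super().__init__()
        # Placeholder encoder; in practice this would be a pretrained
        # molecular/structure model whose weights are loaded here.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # One lightweight head per property prediction task.
        self.heads = nn.ModuleDict(
            {name: nn.Linear(hidden_dim, 1) for name in task_names}
        )

    def forward(self, x: torch.Tensor, task: str) -> torch.Tensor:
        z = self.encoder(x)          # shared representation
        return self.heads[task](z)   # task-specific prediction


# Toy usage: alternate batches from different property datasets so the
# shared encoder is updated by all tasks (feature dim of 64 is arbitrary).
model = MultiTaskPropertyModel(input_dim=64, hidden_dim=128,
                               task_names=["band_gap", "formation_energy"])
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for task in ["band_gap", "formation_energy"]:
    features = torch.randn(8, 64)    # stand-in molecular descriptors
    targets = torch.randn(8, 1)      # stand-in property labels
    loss = loss_fn(model(features, task), targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this pattern, tasks with abundant labels implicitly regularize the shared encoder for tasks with few labels, which is one plausible reading of the information-sharing strategy described above.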