    plugins {
        // Apply the application plugin to add support for
        // building a CLI application in Java.
        id 'application'
    }

    repositories {
        // Use JCenter for resolving dependencies.
        jcenter()
    }

    dependencies {
        // Use JUnit test framework.
        testImplementation 'junit:junit:4.13'

        // This dependency is used by the application.
        implementation 'com.google.guava:guava:29.0-jre'
    }

    application {
        // Define the main class for the application.
        mainClass = 'myproject.App'
    }

• Gradle must know which class is the main class

Dependencies!
    dependencies {
        // Use JUnit test framework.
        testImplementation 'junit:junit:4.13'

        // These dependencies are used by the application.
        implementation 'com.google.guava:guava:29.0-jre'
        implementation 'commons-io:commons-io:2.8.0'
    }
• One subproject for each target
  – android/
  – desktop/
  – gwt/
  – ios/
  – core/
• One core subproject for the actual logic of the game; the target projects contain only launcher classes
• Gradle uses a domain-specific language (DSL) to define targets (android/ios...) and dependencies
• When compiling, Gradle reads the build.gradle file, whose DSL describes all the information needed to compile
• Application is the main entry point of your app
  – http://libgdx.badlogicgames.com/nightlies/docs/api/com/badlogic/gdx/Application.html
• Equivalent to JFrame (Swing) or Activity (Android)
• Informs your game about events such as window resizing
• The developer creates a class that implements ApplicationListener; its methods are called by the Application
  – The Application can be a GwtApplication, IOSApplication, …
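As a sketch, a minimal listener could look like this (the class name and log tag are made up; the method set is the ApplicationListener interface):

```java
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Gdx;

// Hypothetical minimal listener: the Application calls these methods
// at the appropriate points of the app's life cycle.
public class MyGame implements ApplicationListener {
    @Override public void create()  { Gdx.app.log("MyGame", "created"); }
    @Override public void resize(int width, int height) { }
    @Override public void render()  { } // called once per frame
    @Override public void pause()   { }
    @Override public void resume()  { }
    @Override public void dispose() { }
}
```

The same MyGame class is handed to each platform's launcher, so the game logic is written once and run on every target.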
This is platform specific. Android:

    // No main method
    public class AndroidLauncher extends AndroidApplication {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            AndroidApplicationConfiguration config =
                new AndroidApplicationConfiguration();
            MyGame game = new MyGame();
            initialize(game, config);
            if (this.getApplicationListener() == game) {
                this.log("test", "success");
            }
        }
    }
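For comparison, a desktop launcher sketch using the legacy LWJGL backend (the class name DesktopLauncher and the 800 x 400 window size are assumptions):

```java
import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;

// Unlike Android, a desktop Java app does have a main method.
public class DesktopLauncher {
    public static void main(String[] args) {
        LwjglApplicationConfiguration config =
            new LwjglApplicationConfiguration();
        config.width = 800;   // window size in pixels
        config.height = 400;
        // Wraps the same MyGame listener in a desktop Application
        new LwjglApplication(new MyGame(), config);
    }
}
```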
• Accessed via static fields of the Gdx class
  – Global public variables for easy access
• Example
  – AudioDevice audioDevice = Gdx.audio.newAudioDevice(44100, false);
    private Texture gorbaTexture;
    private Sound soundEffect;
    private Music backgroundMusic;
    private SpriteBatch batch;

    @Override
    public void create() {
        // gorba.png is uploaded to the GPU and is ready to be used by
        // OpenGL. Image format must be .jpg, .png or .bmp
        gorbaTexture = new Texture(Gdx.files.internal("gorba.png"));

        // Stored in RAM
        soundEffect = Gdx.audio.newSound(Gdx.files.internal("beep.wav"));

        // Streamed from wherever it's stored
        backgroundMusic =
            Gdx.audio.newMusic(Gdx.files.internal("soviet-anthem.mp3"));

        // Start the playback of the background music immediately
        backgroundMusic.setLooping(true);
        backgroundMusic.play();

        batch = new SpriteBatch();
    }

    @Override
    public void dispose() {
        gorbaTexture.dispose();
        soundEffect.dispose();
        backgroundMusic.dispose();
        batch.dispose();
    }
• The camera films the “world”
  – What part of the “world” is visible?
• The world may be bigger than the visible area
• Camera types
  – OrthographicCamera
    • When the human eye looks at a scene, objects in the distance appear smaller than objects close by. Orthographic projection ignores this effect.
  – PerspectiveCamera
    • Closer objects appear bigger
    // When running on Desktop, the window size is 800 x 400 pixels:
    // config.width = 800;
    // config.height = 400;
    // But the window size may change (resize)! Or we could run the same
    // game on an Android or iOS device. The screen resolution and pixel
    // density vary!
    // It's easier to forget the "real resolution" and use a "world
    // resolution". The world resolution is the same regardless of the
    // real resolution.
    // The most important thing about the world resolution is the aspect
    // ratio. It does not matter if the world resolution is 1000 x 500 or
    // 10 x 5, as long as you are comfortable with the aspect ratio.
    // If the game uses Box2D, it's recommended that you define the world
    // resolution in meters. So we could have a world size of
    // 10 meters x 5 meters.
    // When the world width and height are set, we need to configure what
    // part of the world the camera is filming. By passing false, we
    // simply state that the y-axis points up and the camera is centered
    // at (width / 2, height / 2).
    camera = new OrthographicCamera();
    camera.setToOrtho(false, 10, 5);
• OpenGL is a cross-language, multi-platform application programming interface (API) for rendering 2D and 3D vector graphics.
• The API is typically used to interact with a graphics processing unit (GPU) to achieve hardware-accelerated rendering.
• Widely used in CAD, virtual reality, scientific visualization, information visualization, flight simulation, and video games.
• libGDX uses OpenGL ES and also provides an interface for direct OpenGL access.
• The Texture class represents an OpenGL ES texture.
  – A texture is an OpenGL object that contains one or more images that all have the same image format.
• The image is loaded into the GPU's memory in a raw format.
• Texture mapping is the process of working out where in space the texture will be applied.
  – “To stick a poster on a wall, one needs to figure out where on the wall to glue the corners of the paper”
  – Space ↔ Wall
  – Mesh (Rectangle) ↔ Paper
  – Image on paper ↔ Texture
    @Override
    public void render() {
        // Parameters: red, green, blue, alpha, each in the range [0, 1]
        // https://www.opengl.org/sdk/docs/man/html/glClearColor.xhtml
        Gdx.gl.glClearColor(0, 0, 0.2f, 1);

        // Clear the screen with the chosen color
        // http://www.opengl.org/sdk/docs/man/html/glClear.xhtml
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

        // SpriteBatch is ready for commands
        batch.begin();
        ....
        // No more commands; process the batch of received commands
        batch.end();
    }
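The commands queued between begin() and end() are draw calls. A sketch of what could go in that elided middle part (the texture field, position and size are assumptions, in world units):

```java
batch.begin();
// Draw gorbaTexture with its bottom-left corner at world position
// (1, 1), sized 2 x 2 world units
batch.draw(gorbaTexture, 1, 1, 2, 2);
batch.end();
```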
[Diagram: the window (real resolution 800 x 400, corners (0, 0) to (800, 400)) maps to the world resolution 10 x 5; the world origin (0, 0) is at the bottom-left corner and (10, 5) at the top-right.]
    @Override
    public void render() {
        // SpriteBatch uses the coordinates specified by the camera!
        batch.setProjectionMatrix(camera.combined);

        if (Gdx.input.isTouched()) {
            int realX = Gdx.input.getX();
            int realY = Gdx.input.getY();

            // Encapsulated 3D vector; only 2D is used.
            // Vectors can be used to represent a direction and a position.
            // Bad practice to instantiate one on every render() call!
            Vector3 touchPos = new Vector3(realX, realY, 0);

            // Translates a point given in screen coordinates
            // to world space.
            camera.unproject(touchPos);

            Gdx.app.log("MyGame", "real X = " + realX);
            Gdx.app.log("MyGame", "real Y = " + realY);
            Gdx.app.log("MyGame", "world X = " + touchPos.x);
            Gdx.app.log("MyGame", "world Y = " + touchPos.y);
        }
    }
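For intuition, the conversion that unproject performs for this simple setup (full-screen viewport, y-axis pointing up, camera set with setToOrtho(false, w, h)) can be sketched in plain Java. The class and method names here are made up for illustration:

```java
// Sketch of the screen-to-world conversion behind camera.unproject
// for a full-screen orthographic camera. Screen y grows downwards,
// world y grows upwards, hence the flip.
public class ScreenToWorld {
    static float[] toWorld(int screenX, int screenY,
                           int screenWidth, int screenHeight,
                           float worldWidth, float worldHeight) {
        float worldX = screenX * worldWidth / screenWidth;
        float worldY = (screenHeight - screenY) * worldHeight / screenHeight;
        return new float[] { worldX, worldY };
    }

    public static void main(String[] args) {
        // A touch at pixel (400, 200) in an 800 x 400 window,
        // with a 10 x 5 world:
        float[] p = toWorld(400, 200, 800, 400, 10, 5);
        System.out.println(p[0] + " " + p[1]); // prints "5.0 2.5"
    }
}
```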
• Frame rate is the rate at which an imaging device produces unique consecutive images, called frames
• libGDX tries to call the render() method as fast as possible
• When it reaches 60 fps it goes no faster; there is no need to update any more often
• You can query the FPS:
  – Gdx.graphics.getFramesPerSecond()
• if(left arrow is pressed) { x = x + speed; }
• If this is called 60 times per second, the object moves fast
• If this is called 10 times per second, the object moves slowly!
• Now your game is implemented so that the game speed varies across different mobile devices!
• Delta time is the time elapsed since the last update, in seconds
  – If fps is 60, delta time is 1 / 60 => 0.016
  – If fps is 30, delta time is 1 / 30 => 0.033
• To get the delta time
  – Gdx.graphics.getDeltaTime()
• Now use:
• if(left arrow is pressed) { x = x + speed * delta; }
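The effect can be checked in plain Java without libGDX. This sketch (all names made up) simulates one second of updates at two different frame rates; with speed expressed in world units per second, the travelled distance ends up the same:

```java
// Frame-rate independent movement: x += speed * delta, where
// delta = 1 / fps seconds between updates.
public class DeltaDemo {
    static float simulate(float speed, int fps, float seconds) {
        float x = 0;
        float delta = 1f / fps;           // delta time in seconds
        int frames = (int) (seconds * fps);
        for (int i = 0; i < frames; i++) {
            x = x + speed * delta;        // same formula as in the slide
        }
        return x;
    }

    public static void main(String[] args) {
        // 2 world units / second, simulated for 1 second:
        System.out.println(simulate(2f, 60, 1f)); // ~2.0 at 60 fps
        System.out.println(simulate(2f, 30, 1f)); // ~2.0 at 30 fps
    }
}
```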
• Game objects often share common fields
  – Rectangle rectangle;
  – float speedX;
  – float speedY;
• If all of your game objects hold common features, you could use inheritance
  – class Spaceship extends GameObject
  – class Alien extends GameObject
• In fact we already have the "GameObject" in libGDX: it's called Sprite
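A minimal sketch of the inheritance idea in plain Java (all class and field names are made up; libGDX's Rectangle is replaced by bare x/y fields for brevity):

```java
// Common base class for movable game objects.
class GameObject {
    float x, y;            // position in world units
    float speedX, speedY;  // velocity in world units per second

    // Frame-rate independent movement using delta time (seconds)
    void move(float delta) {
        x += speedX * delta;
        y += speedY * delta;
    }
}

// Concrete game objects inherit position and movement for free.
class Spaceship extends GameObject { }
class Alien extends GameObject { }

public class InheritanceDemo {
    public static void main(String[] args) {
        GameObject ship = new Spaceship();
        ship.speedX = 4f;      // 4 world units per second
        ship.move(0.5f);       // half a second elapsed
        System.out.println(ship.x); // prints "2.0"
    }
}
```

libGDX's Sprite plays this role out of the box: it bundles a texture region with position, size, rotation and color.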