how to inference online with tensorflow2.0? #24

Open
freefuiiismyname opened this issue Dec 18, 2019 · 1 comment

Comments

@freefuiiismyname

freefuiiismyname commented Dec 18, 2019

i am trying to inference online with tensorflow2.0. my code is as follows:

    self.graph = tf.Graph()

    with self.graph.as_default() as g:
        self.input_ids = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size,
                                                             FLAGS.max_seq_length], name="input_ids")
        self.input_mask = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size,
                                                              FLAGS.max_seq_length], name="input_mask")
        self.p_mask = tf.compat.v1.placeholder(tf.float32, [FLAGS.batch_size,
                                                            FLAGS.max_seq_length], name="p_mask")
        self.segment_ids = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size,
                                                               FLAGS.max_seq_length], name="segment_ids")
        self.cls_index = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size], name="cls_index")  # was misnamed "segment_ids"
        self.unique_ids = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size], name="unique_ids")

        # unpacked_inputs = tf_utils.unpack_inputs(inputs)
        self.squad_model = ALBertQAModel(
            albert_config, FLAGS.max_seq_length, init_checkpoint, FLAGS.start_n_top, FLAGS.end_n_top,
            FLAGS.squad_dropout)

        learning_rate_fn = tf.keras.optimizers.schedules.PolynomialDecay(initial_learning_rate=1e-5,
                                                                         decay_steps=10000,
                                                                         end_learning_rate=0.0)
        optimizer_fn = AdamWeightDecay
        optimizer = optimizer_fn(
            learning_rate=learning_rate_fn,
            weight_decay_rate=0.01,
            beta_1=0.9,
            beta_2=0.999,
            epsilon=1e-6,
            exclude_from_weight_decay=['layer_norm', 'bias'])

        self.squad_model.optimizer = optimizer
        graph_init_op = tf.compat.v1.global_variables_initializer()

        y = self.squad_model(
            self.unique_ids, self.input_ids, self.input_mask, self.segment_ids, self.cls_index,
            self.p_mask, training=False)
        self.unique_ids, self.start_tlp, self.start_ti, self.end_tlp, self.end_ti, self.cls_logits = y

        self.sess = tf.compat.v1.Session(graph=self.graph, config=gpu_config)
        self.sess.run(graph_init_op)
        with self.sess.as_default() as sess:
            self.squad_model.load_weights(FLAGS.model_dir)

This code runs, but it produces bad results. It looks like the model's weights are never actually loaded. I suspect this is because I'm not restoring the parameters into the model through the tf.Session, e.g. `saver.restore(sess, tf.train.latest_checkpoint(init_checkpoint))`.
I've tried several approaches, but none has worked. There are very few examples of online inference with TensorFlow 2.0 on the Internet, so I'm having trouble finding a solution. :(
May I get some help here? Thanks very much!
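For what it's worth, the idiomatic TF2 pattern avoids placeholders and sessions entirely: run eagerly, build the Keras model, restore its weights, and call it directly with tensors. A minimal sketch with a stand-in model (the layer sizes and checkpoint path below are illustrative, not the repo's ALBertQAModel):

```python
import numpy as np
import tensorflow as tf

# Stand-in for the real model; in this issue it would be ALBertQAModel.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(2),
])

# Restore trained weights BEFORE inference. The model must be built
# (done above via tf.keras.Input) so the variables exist to load into.
# model.load_weights("/path/to/checkpoint")  # hypothetical path

# Eager TF2: no Session, no placeholders -- feed arrays directly.
batch = np.zeros((1, 8), dtype=np.float32)
logits = model(batch, training=False)
print(logits.shape)  # (1, 2)
```

In the original snippet the `tf.compat.v1.global_variables_initializer()` run after `load_weights` can clobber the restored weights in graph mode, which would explain the bad results.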

@Bidek56

Bidek56 commented Jan 3, 2020

This code works for inferring a single value from a saved model; hopefully it helps.
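The referenced snippet did not survive in this thread, but the usual TF2 pattern for single-value inference from a SavedModel looks roughly like this (a self-contained sketch: the `Scaler` module, its weight value, and the export path are stand-ins, not this repo's model):

```python
import os
import tempfile
import tensorflow as tf

# Tiny tf.Module standing in for a trained model.
class Scaler(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(2.0)

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return self.w * x

# Export to the SavedModel format, then reload it.
export_dir = os.path.join(tempfile.mkdtemp(), "scaler")
tf.saved_model.save(Scaler(), export_dir)
loaded = tf.saved_model.load(export_dir)

# Inference on a single value -- no Session, no placeholders.
result = loaded(tf.constant([1.5]))
print(result.numpy())  # [3.]
```

The `input_signature` on the `@tf.function` is what lets the restored object be called directly after loading.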
